00:00:00.000 Started by upstream project "autotest-per-patch" build number 126173 00:00:00.000 originally caused by: 00:00:00.001 Started by upstream project "jbp-per-patch" build number 23928 00:00:00.001 originally caused by: 00:00:00.001 Started by user sys_sgci 00:00:00.059 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy 00:00:00.060 The recommended git tool is: git 00:00:00.060 using credential 00000000-0000-0000-0000-000000000002 00:00:00.062 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10 00:00:00.107 Fetching changes from the remote Git repository 00:00:00.110 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10 00:00:00.180 Using shallow fetch with depth 1 00:00:00.180 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool 00:00:00.180 > git --version # timeout=10 00:00:00.227 > git --version # 'git version 2.39.2' 00:00:00.228 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials 00:00:00.266 Setting http proxy: proxy-dmz.intel.com:911 00:00:00.266 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/changes/71/24171/1 # timeout=5 00:00:05.594 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10 00:00:05.609 > git rev-parse FETCH_HEAD^{commit} # timeout=10 00:00:05.622 Checking out Revision f574307dba849e7d22dd5631ce9e594362bd2ebc (FETCH_HEAD) 00:00:05.622 > git config core.sparsecheckout # timeout=10 00:00:05.636 > git read-tree -mu HEAD # timeout=10 00:00:05.654 > git checkout -f f574307dba849e7d22dd5631ce9e594362bd2ebc # timeout=5 
00:00:05.679 Commit message: "packer: Drop centos7" 00:00:05.679 > git rev-list --no-walk 1cada6d681c9931648d947263dba569d3956eaf1 # timeout=10 00:00:05.775 [Pipeline] Start of Pipeline 00:00:05.790 [Pipeline] library 00:00:05.792 Loading library shm_lib@master 00:00:05.792 Library shm_lib@master is cached. Copying from home. 00:00:05.816 [Pipeline] node 00:00:05.829 Running on WFP40 in /var/jenkins/workspace/crypto-phy-autotest 00:00:05.831 [Pipeline] { 00:00:05.844 [Pipeline] catchError 00:00:05.846 [Pipeline] { 00:00:05.862 [Pipeline] wrap 00:00:05.871 [Pipeline] { 00:00:05.877 [Pipeline] stage 00:00:05.879 [Pipeline] { (Prologue) 00:00:06.078 [Pipeline] sh 00:00:06.360 + logger -p user.info -t JENKINS-CI 00:00:06.376 [Pipeline] echo 00:00:06.377 Node: WFP40 00:00:06.383 [Pipeline] sh 00:00:06.680 [Pipeline] setCustomBuildProperty 00:00:06.695 [Pipeline] echo 00:00:06.696 Cleanup processes 00:00:06.700 [Pipeline] sh 00:00:06.981 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.981 1289672 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:06.994 [Pipeline] sh 00:00:07.274 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:00:07.274 ++ grep -v 'sudo pgrep' 00:00:07.274 ++ awk '{print $1}' 00:00:07.274 + sudo kill -9 00:00:07.274 + true 00:00:07.287 [Pipeline] cleanWs 00:00:07.295 [WS-CLEANUP] Deleting project workspace... 00:00:07.296 [WS-CLEANUP] Deferred wipeout is used... 
00:00:07.303 [WS-CLEANUP] done 00:00:07.306 [Pipeline] setCustomBuildProperty 00:00:07.317 [Pipeline] sh 00:00:07.595 + sudo git config --global --replace-all safe.directory '*' 00:00:07.677 [Pipeline] httpRequest 00:00:07.705 [Pipeline] echo 00:00:07.707 Sorcerer 10.211.164.101 is alive 00:00:07.714 [Pipeline] httpRequest 00:00:07.718 HttpMethod: GET 00:00:07.718 URL: http://10.211.164.101/packages/jbp_f574307dba849e7d22dd5631ce9e594362bd2ebc.tar.gz 00:00:07.719 Sending request to url: http://10.211.164.101/packages/jbp_f574307dba849e7d22dd5631ce9e594362bd2ebc.tar.gz 00:00:07.733 Response Code: HTTP/1.1 200 OK 00:00:07.734 Success: Status code 200 is in the accepted range: 200,404 00:00:07.735 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_f574307dba849e7d22dd5631ce9e594362bd2ebc.tar.gz 00:00:11.475 [Pipeline] sh 00:00:11.759 + tar --no-same-owner -xf jbp_f574307dba849e7d22dd5631ce9e594362bd2ebc.tar.gz 00:00:11.777 [Pipeline] httpRequest 00:00:11.805 [Pipeline] echo 00:00:11.807 Sorcerer 10.211.164.101 is alive 00:00:11.817 [Pipeline] httpRequest 00:00:11.822 HttpMethod: GET 00:00:11.823 URL: http://10.211.164.101/packages/spdk_2728651eeb6994be786e188da61cae84c5bb49ac.tar.gz 00:00:11.824 Sending request to url: http://10.211.164.101/packages/spdk_2728651eeb6994be786e188da61cae84c5bb49ac.tar.gz 00:00:11.848 Response Code: HTTP/1.1 200 OK 00:00:11.849 Success: Status code 200 is in the accepted range: 200,404 00:00:11.849 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_2728651eeb6994be786e188da61cae84c5bb49ac.tar.gz 00:01:08.758 [Pipeline] sh 00:01:09.043 + tar --no-same-owner -xf spdk_2728651eeb6994be786e188da61cae84c5bb49ac.tar.gz 00:01:13.249 [Pipeline] sh 00:01:13.533 + git -C spdk log --oneline -n5 00:01:13.533 2728651ee accel: adjust task per ch define name 00:01:13.533 e7cce062d Examples/Perf: correct the calculation of total bandwidth 00:01:13.533 3b4b1d00c libvfio-user: bump MAX_DMA_REGIONS 00:01:13.533 
32a79de81 lib/event: add disable_cpumask_locks to spdk_app_opts 00:01:13.533 719d03c6a sock/uring: only register net impl if supported 00:01:13.546 [Pipeline] } 00:01:13.564 [Pipeline] // stage 00:01:13.574 [Pipeline] stage 00:01:13.577 [Pipeline] { (Prepare) 00:01:13.623 [Pipeline] writeFile 00:01:13.642 [Pipeline] sh 00:01:13.924 + logger -p user.info -t JENKINS-CI 00:01:13.935 [Pipeline] sh 00:01:14.217 + logger -p user.info -t JENKINS-CI 00:01:14.229 [Pipeline] sh 00:01:14.512 + cat autorun-spdk.conf 00:01:14.512 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:14.512 SPDK_TEST_BLOCKDEV=1 00:01:14.512 SPDK_TEST_ISAL=1 00:01:14.512 SPDK_TEST_CRYPTO=1 00:01:14.512 SPDK_TEST_REDUCE=1 00:01:14.512 SPDK_TEST_VBDEV_COMPRESS=1 00:01:14.512 SPDK_RUN_UBSAN=1 00:01:14.519 RUN_NIGHTLY=0 00:01:14.524 [Pipeline] readFile 00:01:14.552 [Pipeline] withEnv 00:01:14.554 [Pipeline] { 00:01:14.567 [Pipeline] sh 00:01:14.857 + set -ex 00:01:14.858 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]] 00:01:14.858 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:14.858 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:14.858 ++ SPDK_TEST_BLOCKDEV=1 00:01:14.858 ++ SPDK_TEST_ISAL=1 00:01:14.858 ++ SPDK_TEST_CRYPTO=1 00:01:14.858 ++ SPDK_TEST_REDUCE=1 00:01:14.858 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:14.858 ++ SPDK_RUN_UBSAN=1 00:01:14.858 ++ RUN_NIGHTLY=0 00:01:14.858 + case $SPDK_TEST_NVMF_NICS in 00:01:14.858 + DRIVERS= 00:01:14.858 + [[ -n '' ]] 00:01:14.858 + exit 0 00:01:14.878 [Pipeline] } 00:01:14.922 [Pipeline] // withEnv 00:01:14.945 [Pipeline] } 00:01:14.961 [Pipeline] // stage 00:01:14.969 [Pipeline] catchError 00:01:14.970 [Pipeline] { 00:01:14.982 [Pipeline] timeout 00:01:14.982 Timeout set to expire in 40 min 00:01:14.984 [Pipeline] { 00:01:14.996 [Pipeline] stage 00:01:14.998 [Pipeline] { (Tests) 00:01:15.009 [Pipeline] sh 00:01:15.338 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest 00:01:15.338 ++ readlink -f 
/var/jenkins/workspace/crypto-phy-autotest 00:01:15.338 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest 00:01:15.338 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]] 00:01:15.338 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:15.338 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output 00:01:15.338 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]] 00:01:15.338 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:15.338 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output 00:01:15.338 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]] 00:01:15.338 + [[ crypto-phy-autotest == pkgdep-* ]] 00:01:15.338 + cd /var/jenkins/workspace/crypto-phy-autotest 00:01:15.338 + source /etc/os-release 00:01:15.338 ++ NAME='Fedora Linux' 00:01:15.338 ++ VERSION='38 (Cloud Edition)' 00:01:15.338 ++ ID=fedora 00:01:15.338 ++ VERSION_ID=38 00:01:15.338 ++ VERSION_CODENAME= 00:01:15.338 ++ PLATFORM_ID=platform:f38 00:01:15.338 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)' 00:01:15.338 ++ ANSI_COLOR='0;38;2;60;110;180' 00:01:15.338 ++ LOGO=fedora-logo-icon 00:01:15.338 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38 00:01:15.338 ++ HOME_URL=https://fedoraproject.org/ 00:01:15.338 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/ 00:01:15.338 ++ SUPPORT_URL=https://ask.fedoraproject.org/ 00:01:15.338 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/ 00:01:15.338 ++ REDHAT_BUGZILLA_PRODUCT=Fedora 00:01:15.338 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38 00:01:15.338 ++ REDHAT_SUPPORT_PRODUCT=Fedora 00:01:15.338 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38 00:01:15.338 ++ SUPPORT_END=2024-05-14 00:01:15.338 ++ VARIANT='Cloud Edition' 00:01:15.338 ++ VARIANT_ID=cloud 00:01:15.338 + uname -a 00:01:15.338 Linux spdk-wfp-40 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux 00:01:15.338 + sudo 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:01:18.625 Hugepages 00:01:18.625 node hugesize free / total 00:01:18.625 node0 1048576kB 0 / 0 00:01:18.625 node0 2048kB 0 / 0 00:01:18.625 node1 1048576kB 0 / 0 00:01:18.625 node1 2048kB 0 / 0 00:01:18.625 00:01:18.625 Type BDF Vendor Device NUMA Driver Device Block devices 00:01:18.625 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:01:18.625 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:01:18.625 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:01:18.625 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:01:18.625 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:01:18.625 + rm -f /tmp/spdk-ld-path 00:01:18.625 + source autorun-spdk.conf 00:01:18.625 ++ SPDK_RUN_FUNCTIONAL_TEST=1 00:01:18.625 ++ SPDK_TEST_BLOCKDEV=1 00:01:18.625 ++ SPDK_TEST_ISAL=1 00:01:18.625 ++ SPDK_TEST_CRYPTO=1 00:01:18.625 ++ SPDK_TEST_REDUCE=1 00:01:18.625 ++ SPDK_TEST_VBDEV_COMPRESS=1 00:01:18.625 ++ SPDK_RUN_UBSAN=1 00:01:18.625 ++ RUN_NIGHTLY=0 00:01:18.625 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 )) 00:01:18.625 + [[ -n '' ]] 00:01:18.626 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:18.626 + for M in /var/spdk/build-*-manifest.txt 00:01:18.626 + [[ -f /var/spdk/build-pkg-manifest.txt ]] 
00:01:18.626 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:18.626 + for M in /var/spdk/build-*-manifest.txt 00:01:18.626 + [[ -f /var/spdk/build-repo-manifest.txt ]] 00:01:18.626 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/ 00:01:18.626 ++ uname 00:01:18.626 + [[ Linux == \L\i\n\u\x ]] 00:01:18.626 + sudo dmesg -T 00:01:18.626 + sudo dmesg --clear 00:01:18.626 + dmesg_pid=1291144 00:01:18.626 + [[ Fedora Linux == FreeBSD ]] 00:01:18.626 + export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:18.626 + UNBIND_ENTIRE_IOMMU_GROUP=yes 00:01:18.626 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]] 00:01:18.626 + export VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:18.626 + VM_IMAGE=/var/spdk/dependencies/vhost/spdk_test_image.qcow2 00:01:18.626 + [[ -x /usr/src/fio-static/fio ]] 00:01:18.626 + export FIO_BIN=/usr/src/fio-static/fio 00:01:18.626 + FIO_BIN=/usr/src/fio-static/fio 00:01:18.626 + sudo dmesg -Tw 00:01:18.626 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]] 00:01:18.626 + [[ ! 
-v VFIO_QEMU_BIN ]] 00:01:18.626 + [[ -e /usr/local/qemu/vfio-user-latest ]] 00:01:18.626 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:18.626 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:01:18.626 + [[ -e /usr/local/qemu/vanilla-latest ]] 00:01:18.626 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:18.626 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:01:18.626 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf 00:01:18.626 Test configuration: 00:01:18.626 SPDK_RUN_FUNCTIONAL_TEST=1 00:01:18.626 SPDK_TEST_BLOCKDEV=1 00:01:18.626 SPDK_TEST_ISAL=1 00:01:18.626 SPDK_TEST_CRYPTO=1 00:01:18.626 SPDK_TEST_REDUCE=1 00:01:18.626 SPDK_TEST_VBDEV_COMPRESS=1 00:01:18.626 SPDK_RUN_UBSAN=1 00:01:18.885 RUN_NIGHTLY=0 11:42:32 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:01:18.885 11:42:32 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:01:18.885 11:42:32 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:01:18.885 11:42:32 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh 00:01:18.885 11:42:32 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:18.885 11:42:32 -- paths/export.sh@3 -- $ 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:18.885 11:42:32 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:18.885 11:42:32 -- paths/export.sh@5 -- $ export PATH 00:01:18.885 11:42:32 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:01:18.885 11:42:32 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:01:18.885 11:42:32 -- common/autobuild_common.sh@444 -- $ date +%s 00:01:18.885 11:42:32 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721036552.XXXXXX 00:01:18.885 11:42:32 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721036552.0sQwUm 00:01:18.885 11:42:32 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]] 00:01:18.885 11:42:32 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']' 00:01:18.885 11:42:32 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:01:18.885 
11:42:32 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:01:18.885 11:42:32 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:01:18.885 11:42:32 -- common/autobuild_common.sh@460 -- $ get_config_params 00:01:18.885 11:42:32 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:01:18.885 11:42:32 -- common/autotest_common.sh@10 -- $ set +x 00:01:18.885 11:42:32 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:01:18.885 11:42:32 -- common/autobuild_common.sh@462 -- $ start_monitor_resources 00:01:18.885 11:42:32 -- pm/common@17 -- $ local monitor 00:01:18.885 11:42:32 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:18.885 11:42:32 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:18.885 11:42:32 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:18.885 11:42:32 -- pm/common@21 -- $ date +%s 00:01:18.885 11:42:32 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:01:18.885 11:42:32 -- pm/common@21 -- $ date +%s 00:01:18.885 11:42:32 -- pm/common@25 -- $ sleep 1 00:01:18.886 11:42:32 -- pm/common@21 -- $ date +%s 00:01:18.886 11:42:32 -- pm/common@21 -- $ date +%s 00:01:18.886 11:42:32 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721036552 00:01:18.886 11:42:32 -- pm/common@21 -- $ 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721036552 00:01:18.886 11:42:32 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721036552 00:01:18.886 11:42:32 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721036552 00:01:18.886 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721036552_collect-vmstat.pm.log 00:01:18.886 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721036552_collect-cpu-load.pm.log 00:01:18.886 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721036552_collect-cpu-temp.pm.log 00:01:18.886 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721036552_collect-bmc-pm.bmc.pm.log 00:01:19.823 11:42:33 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT 00:01:19.823 11:42:33 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD= 00:01:19.824 11:42:33 -- spdk/autobuild.sh@12 -- $ umask 022 00:01:19.824 11:42:33 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:01:19.824 11:42:33 -- spdk/autobuild.sh@16 -- $ date -u 00:01:19.824 Mon Jul 15 09:42:33 AM UTC 2024 00:01:19.824 11:42:33 -- spdk/autobuild.sh@17 -- $ git describe --tags 00:01:19.824 v24.09-pre-206-g2728651ee 00:01:19.824 11:42:33 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']' 00:01:19.824 11:42:33 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']' 00:01:19.824 11:42:33 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using 
ubsan' 00:01:19.824 11:42:33 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:19.824 11:42:33 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:19.824 11:42:33 -- common/autotest_common.sh@10 -- $ set +x 00:01:19.824 ************************************ 00:01:19.824 START TEST ubsan 00:01:19.824 ************************************ 00:01:19.824 11:42:33 ubsan -- common/autotest_common.sh@1123 -- $ echo 'using ubsan' 00:01:19.824 using ubsan 00:01:19.824 00:01:19.824 real 0m0.001s 00:01:19.824 user 0m0.000s 00:01:19.824 sys 0m0.000s 00:01:19.824 11:42:33 ubsan -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:01:19.824 11:42:33 ubsan -- common/autotest_common.sh@10 -- $ set +x 00:01:19.824 ************************************ 00:01:19.824 END TEST ubsan 00:01:19.824 ************************************ 00:01:20.084 11:42:33 -- common/autotest_common.sh@1142 -- $ return 0 00:01:20.084 11:42:33 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']' 00:01:20.084 11:42:33 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in 00:01:20.084 11:42:33 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]] 00:01:20.084 11:42:33 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared 00:01:20.084 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:01:20.084 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 
00:01:20.343 Using 'verbs' RDMA provider 00:01:36.615 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done. 00:01:51.501 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done. 00:01:52.070 Creating mk/config.mk...done. 00:01:52.070 Creating mk/cc.flags.mk...done. 00:01:52.070 Type 'make' to build. 00:01:52.070 11:43:05 -- spdk/autobuild.sh@69 -- $ run_test make make -j72 00:01:52.070 11:43:05 -- common/autotest_common.sh@1099 -- $ '[' 3 -le 1 ']' 00:01:52.070 11:43:05 -- common/autotest_common.sh@1105 -- $ xtrace_disable 00:01:52.070 11:43:05 -- common/autotest_common.sh@10 -- $ set +x 00:01:52.070 ************************************ 00:01:52.070 START TEST make 00:01:52.070 ************************************ 00:01:52.070 11:43:05 make -- common/autotest_common.sh@1123 -- $ make -j72 00:01:52.330 make[1]: Nothing to be done for 'all'. 00:02:31.072 The Meson build system 00:02:31.072 Version: 1.3.1 00:02:31.072 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk 00:02:31.072 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp 00:02:31.072 Build type: native build 00:02:31.072 Program cat found: YES (/usr/bin/cat) 00:02:31.072 Project name: DPDK 00:02:31.072 Project version: 24.03.0 00:02:31.072 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)") 00:02:31.072 C linker for the host machine: cc ld.bfd 2.39-16 00:02:31.072 Host machine cpu family: x86_64 00:02:31.072 Host machine cpu: x86_64 00:02:31.072 Message: ## Building in Developer Mode ## 00:02:31.072 Program pkg-config found: YES (/usr/bin/pkg-config) 00:02:31.072 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh) 00:02:31.072 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh) 
00:02:31.072 Program python3 found: YES (/usr/bin/python3) 00:02:31.072 Program cat found: YES (/usr/bin/cat) 00:02:31.072 Compiler for C supports arguments -march=native: YES 00:02:31.072 Checking for size of "void *" : 8 00:02:31.072 Checking for size of "void *" : 8 (cached) 00:02:31.072 Compiler for C supports link arguments -Wl,--undefined-version: NO 00:02:31.072 Library m found: YES 00:02:31.072 Library numa found: YES 00:02:31.072 Has header "numaif.h" : YES 00:02:31.072 Library fdt found: NO 00:02:31.072 Library execinfo found: NO 00:02:31.072 Has header "execinfo.h" : YES 00:02:31.072 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0 00:02:31.072 Run-time dependency libarchive found: NO (tried pkgconfig) 00:02:31.072 Run-time dependency libbsd found: NO (tried pkgconfig) 00:02:31.072 Run-time dependency jansson found: NO (tried pkgconfig) 00:02:31.072 Run-time dependency openssl found: YES 3.0.9 00:02:31.072 Run-time dependency libpcap found: YES 1.10.4 00:02:31.072 Has header "pcap.h" with dependency libpcap: YES 00:02:31.072 Compiler for C supports arguments -Wcast-qual: YES 00:02:31.072 Compiler for C supports arguments -Wdeprecated: YES 00:02:31.072 Compiler for C supports arguments -Wformat: YES 00:02:31.072 Compiler for C supports arguments -Wformat-nonliteral: NO 00:02:31.072 Compiler for C supports arguments -Wformat-security: NO 00:02:31.072 Compiler for C supports arguments -Wmissing-declarations: YES 00:02:31.072 Compiler for C supports arguments -Wmissing-prototypes: YES 00:02:31.072 Compiler for C supports arguments -Wnested-externs: YES 00:02:31.072 Compiler for C supports arguments -Wold-style-definition: YES 00:02:31.072 Compiler for C supports arguments -Wpointer-arith: YES 00:02:31.072 Compiler for C supports arguments -Wsign-compare: YES 00:02:31.072 Compiler for C supports arguments -Wstrict-prototypes: YES 00:02:31.072 Compiler for C supports arguments -Wundef: YES 00:02:31.072 Compiler for C supports arguments -Wwrite-strings: YES 
00:02:31.072 Compiler for C supports arguments -Wno-address-of-packed-member: YES 00:02:31.072 Compiler for C supports arguments -Wno-packed-not-aligned: YES 00:02:31.072 Compiler for C supports arguments -Wno-missing-field-initializers: YES 00:02:31.072 Compiler for C supports arguments -Wno-zero-length-bounds: YES 00:02:31.072 Program objdump found: YES (/usr/bin/objdump) 00:02:31.072 Compiler for C supports arguments -mavx512f: YES 00:02:31.072 Checking if "AVX512 checking" compiles: YES 00:02:31.072 Fetching value of define "__SSE4_2__" : 1 00:02:31.072 Fetching value of define "__AES__" : 1 00:02:31.072 Fetching value of define "__AVX__" : 1 00:02:31.072 Fetching value of define "__AVX2__" : 1 00:02:31.072 Fetching value of define "__AVX512BW__" : 1 00:02:31.072 Fetching value of define "__AVX512CD__" : 1 00:02:31.072 Fetching value of define "__AVX512DQ__" : 1 00:02:31.072 Fetching value of define "__AVX512F__" : 1 00:02:31.072 Fetching value of define "__AVX512VL__" : 1 00:02:31.072 Fetching value of define "__PCLMUL__" : 1 00:02:31.072 Fetching value of define "__RDRND__" : 1 00:02:31.072 Fetching value of define "__RDSEED__" : 1 00:02:31.073 Fetching value of define "__VPCLMULQDQ__" : (undefined) 00:02:31.073 Fetching value of define "__znver1__" : (undefined) 00:02:31.073 Fetching value of define "__znver2__" : (undefined) 00:02:31.073 Fetching value of define "__znver3__" : (undefined) 00:02:31.073 Fetching value of define "__znver4__" : (undefined) 00:02:31.073 Compiler for C supports arguments -Wno-format-truncation: YES 00:02:31.073 Message: lib/log: Defining dependency "log" 00:02:31.073 Message: lib/kvargs: Defining dependency "kvargs" 00:02:31.073 Message: lib/telemetry: Defining dependency "telemetry" 00:02:31.073 Checking for function "getentropy" : NO 00:02:31.073 Message: lib/eal: Defining dependency "eal" 00:02:31.073 Message: lib/ring: Defining dependency "ring" 00:02:31.073 Message: lib/rcu: Defining dependency "rcu" 00:02:31.073 Message: 
lib/mempool: Defining dependency "mempool"
00:02:31.073 Message: lib/mbuf: Defining dependency "mbuf"
00:02:31.073 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:31.073 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:31.073 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:31.073 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:31.073 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:31.073 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:02:31.073 Compiler for C supports arguments -mpclmul: YES
00:02:31.073 Compiler for C supports arguments -maes: YES
00:02:31.073 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:31.073 Compiler for C supports arguments -mavx512bw: YES
00:02:31.073 Compiler for C supports arguments -mavx512dq: YES
00:02:31.073 Compiler for C supports arguments -mavx512vl: YES
00:02:31.073 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:31.073 Compiler for C supports arguments -mavx2: YES
00:02:31.073 Compiler for C supports arguments -mavx: YES
00:02:31.073 Message: lib/net: Defining dependency "net"
00:02:31.073 Message: lib/meter: Defining dependency "meter"
00:02:31.073 Message: lib/ethdev: Defining dependency "ethdev"
00:02:31.073 Message: lib/pci: Defining dependency "pci"
00:02:31.073 Message: lib/cmdline: Defining dependency "cmdline"
00:02:31.073 Message: lib/hash: Defining dependency "hash"
00:02:31.073 Message: lib/timer: Defining dependency "timer"
00:02:31.073 Message: lib/compressdev: Defining dependency "compressdev"
00:02:31.073 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:31.073 Message: lib/dmadev: Defining dependency "dmadev"
00:02:31.073 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:31.073 Message: lib/power: Defining dependency "power"
00:02:31.073 Message: lib/reorder: Defining dependency "reorder"
00:02:31.073 Message: lib/security: Defining dependency "security"
00:02:31.073 Has header "linux/userfaultfd.h" : YES
00:02:31.073 Has header "linux/vduse.h" : YES
00:02:31.073 Message: lib/vhost: Defining dependency "vhost"
00:02:31.073 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:31.073 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:31.073 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:31.073 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:31.073 Compiler for C supports arguments -std=c11: YES
00:02:31.073 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:31.073 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:31.073 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:31.073 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:31.073 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:31.073 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:31.073 Library mtcr_ul found: NO
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:31.073 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:02:36.343 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES
00:02:36.343 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO
00:02:36.344 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES
00:02:36.344 Configuring mlx5_autoconf.h using configuration
00:02:36.344 Message: drivers/common/mlx5: Defining dependency "common_mlx5"
00:02:36.344 Run-time dependency libcrypto found: YES 3.0.9
00:02:36.344 Library IPSec_MB found: YES
00:02:36.344 Fetching value of define "IMB_VERSION_STR" : "1.5.0"
00:02:36.344 Message: drivers/common/qat: Defining dependency "common_qat"
00:02:36.344 Message: drivers/mempool/ring: Defining dependency "mempool_ring"
00:02:36.344 Message: Disabling raw/* drivers: missing internal dependency "rawdev"
00:02:36.344 Library IPSec_MB found: YES
00:02:36.344 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached)
00:02:36.344 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb"
00:02:36.344 Compiler for C supports arguments -std=c11: YES (cached)
00:02:36.344 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:02:36.344 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5"
00:02:36.344 Run-time dependency libisal found: NO (tried pkgconfig)
00:02:36.344 Library libisal found: NO
00:02:36.344 Message: drivers/compress/isal: Defining dependency "compress_isal"
00:02:36.344 Compiler for C supports arguments -std=c11: YES (cached)
00:02:36.344 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached)
00:02:36.344 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached)
00:02:36.344 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5"
00:02:36.344 Message: Disabling regex/* drivers: missing internal dependency "regexdev"
00:02:36.344 Message: Disabling ml/* drivers: missing internal dependency "mldev"
00:02:36.344 Message: Disabling event/* drivers: missing internal dependency "eventdev"
00:02:36.344 Message: Disabling baseband/* drivers: missing internal dependency "bbdev"
00:02:36.344 Message: Disabling gpu/* drivers: missing internal dependency "gpudev"
00:02:36.344 Program doxygen found: YES (/usr/bin/doxygen)
00:02:36.344 Configuring doxy-api-html.conf using configuration
00:02:36.344 Configuring doxy-api-man.conf using configuration
00:02:36.344 Program mandb found: YES (/usr/bin/mandb)
00:02:36.344 Program sphinx-build found: NO
00:02:36.344 Configuring rte_build_config.h using configuration
00:02:36.344 Message:
00:02:36.344 =================
00:02:36.344 Applications Enabled
00:02:36.344 =================
00:02:36.344
00:02:36.344 apps:
00:02:36.344
00:02:36.344
00:02:36.344 Message:
00:02:36.344 =================
00:02:36.344 Libraries Enabled
00:02:36.344 =================
00:02:36.344
00:02:36.344 libs:
00:02:36.344 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf,
00:02:36.344 net, meter, ethdev, pci, cmdline, hash, timer, compressdev,
00:02:36.344 cryptodev, dmadev, power, reorder, security, vhost,
00:02:36.344
00:02:36.344 Message:
00:02:36.344 ===============
00:02:36.344 Drivers Enabled
00:02:36.344 ===============
00:02:36.344
00:02:36.344 common:
00:02:36.344 mlx5, qat,
00:02:36.344 bus:
00:02:36.344 auxiliary, pci, vdev,
00:02:36.344 mempool:
00:02:36.344 ring,
00:02:36.344 dma:
00:02:36.344
00:02:36.344 net:
00:02:36.344
00:02:36.344 crypto:
00:02:36.344 ipsec_mb, mlx5,
00:02:36.344 compress:
00:02:36.344 isal, mlx5,
00:02:36.344 vdpa:
00:02:36.344
00:02:36.344
00:02:36.344 Message:
00:02:36.344 =================
00:02:36.344 Content Skipped
00:02:36.344 =================
00:02:36.344
00:02:36.344 apps:
00:02:36.344 dumpcap: explicitly disabled via build config
00:02:36.344 graph: explicitly disabled via build config
00:02:36.344 pdump: explicitly disabled via build config
00:02:36.344 proc-info: explicitly disabled via build config
00:02:36.344 test-acl: explicitly disabled via build config
00:02:36.344 test-bbdev: explicitly disabled via build config
00:02:36.344 test-cmdline: explicitly disabled via build config
00:02:36.344 test-compress-perf: explicitly disabled via build config
00:02:36.344 test-crypto-perf: explicitly disabled via build config
00:02:36.344 test-dma-perf: explicitly disabled via build config
00:02:36.344 test-eventdev: explicitly disabled via build config
00:02:36.344 test-fib: explicitly disabled via build config
00:02:36.344 test-flow-perf: explicitly disabled via build config
00:02:36.344 test-gpudev: explicitly disabled via build config
00:02:36.344 test-mldev: explicitly disabled via build config
00:02:36.344 test-pipeline: explicitly disabled via build config
00:02:36.344 test-pmd: explicitly disabled via build config
00:02:36.344 test-regex: explicitly disabled via build config
00:02:36.344 test-sad: explicitly disabled via build config
00:02:36.344 test-security-perf: explicitly disabled via build config
00:02:36.344
00:02:36.344 libs:
00:02:36.344 argparse: explicitly disabled via build config
00:02:36.344 metrics: explicitly disabled via build config
00:02:36.344 acl: explicitly disabled via build config
00:02:36.344 bbdev: explicitly disabled via build config
00:02:36.344 bitratestats: explicitly disabled via build config
00:02:36.344 bpf: explicitly disabled via build config
00:02:36.344 cfgfile: explicitly disabled via build config
00:02:36.344 distributor: explicitly disabled via build config
00:02:36.344 efd: explicitly disabled via build config
00:02:36.344 eventdev: explicitly disabled via build config
00:02:36.344 dispatcher: explicitly disabled via build config
00:02:36.344 gpudev: explicitly disabled via build config
00:02:36.344 gro: explicitly disabled via build config
00:02:36.344 gso: explicitly disabled via build config
00:02:36.344 ip_frag: explicitly disabled via build config
00:02:36.344 jobstats: explicitly disabled via build config
00:02:36.344 latencystats: explicitly disabled via build config
00:02:36.344 lpm: explicitly disabled via build config
00:02:36.344 member: explicitly disabled via build config
00:02:36.344 pcapng: explicitly disabled via build config
00:02:36.344 rawdev: explicitly disabled via build config
00:02:36.344 regexdev: explicitly disabled via build config
00:02:36.344 mldev: explicitly disabled via build config
00:02:36.344 rib: explicitly disabled via build config
00:02:36.344 sched: explicitly disabled via build config
00:02:36.344 stack: explicitly disabled via build config
00:02:36.344 ipsec: explicitly disabled via build config
00:02:36.344 pdcp: explicitly disabled via build config
00:02:36.344 fib: explicitly disabled via build config
00:02:36.344 port: explicitly disabled via build config
00:02:36.344 pdump: explicitly disabled via build config
00:02:36.344 table: explicitly disabled via build config
00:02:36.344 pipeline: explicitly disabled via build config
00:02:36.344 graph: explicitly disabled via build config
00:02:36.344 node: explicitly disabled via build config
00:02:36.344
00:02:36.344 drivers:
00:02:36.344 common/cpt: not in enabled drivers build config
00:02:36.344 common/dpaax: not in enabled drivers build config
00:02:36.344 common/iavf: not in enabled drivers build config
00:02:36.344 common/idpf: not in enabled drivers build config
00:02:36.344 common/ionic: not in enabled drivers build config
00:02:36.344 common/mvep: not in enabled drivers build config
00:02:36.344 common/octeontx: not in enabled drivers build config
00:02:36.344 bus/cdx: not in enabled drivers build config
00:02:36.344 bus/dpaa: not in enabled drivers build config
00:02:36.344 bus/fslmc: not in enabled drivers build config
00:02:36.344 bus/ifpga: not in enabled drivers build config
00:02:36.344 bus/platform: not in enabled drivers build config
00:02:36.344 bus/uacce: not in enabled drivers build config
00:02:36.344 bus/vmbus: not in enabled drivers build config
00:02:36.344 common/cnxk: not in enabled drivers build config
00:02:36.344 common/nfp: not in enabled drivers build config
00:02:36.344 common/nitrox: not in enabled drivers build config
00:02:36.344 common/sfc_efx: not in enabled drivers build config
00:02:36.344 mempool/bucket: not in enabled drivers build config
00:02:36.344 mempool/cnxk: not in enabled drivers build config
00:02:36.344 mempool/dpaa: not in enabled drivers build config
00:02:36.344 mempool/dpaa2: not in enabled drivers build config
00:02:36.344 mempool/octeontx: not in enabled drivers build config
00:02:36.344 mempool/stack: not in enabled drivers build config
00:02:36.344 dma/cnxk: not in enabled drivers build config
00:02:36.344 dma/dpaa: not in enabled drivers build config
00:02:36.344 dma/dpaa2: not in enabled drivers build config
00:02:36.344 dma/hisilicon: not in enabled drivers build config
00:02:36.344 dma/idxd: not in enabled drivers build config
00:02:36.344 dma/ioat: not in enabled drivers build config
00:02:36.344 dma/skeleton: not in enabled drivers build config
00:02:36.344 net/af_packet: not in enabled drivers build config
00:02:36.344 net/af_xdp: not in enabled drivers build config
00:02:36.344 net/ark: not in enabled drivers build config
00:02:36.344 net/atlantic: not in enabled drivers build config
00:02:36.344 net/avp: not in enabled drivers build config
00:02:36.344 net/axgbe: not in enabled drivers build config
00:02:36.345 net/bnx2x: not in enabled drivers build config
00:02:36.345 net/bnxt: not in enabled drivers build config
00:02:36.345 net/bonding: not in enabled drivers build config
00:02:36.345 net/cnxk: not in enabled drivers build config
00:02:36.345 net/cpfl: not in enabled drivers build config
00:02:36.345 net/cxgbe: not in enabled drivers build config
00:02:36.345 net/dpaa: not in enabled drivers build config
00:02:36.345 net/dpaa2: not in enabled drivers build config
00:02:36.345 net/e1000: not in enabled drivers build config
00:02:36.345 net/ena: not in enabled drivers build config
00:02:36.345 net/enetc: not in enabled drivers build config
00:02:36.345 net/enetfec: not in enabled drivers build config
00:02:36.345 net/enic: not in enabled drivers build config
00:02:36.345 net/failsafe: not in enabled drivers build config
00:02:36.345 net/fm10k: not in enabled drivers build config
00:02:36.345 net/gve: not in enabled drivers build config
00:02:36.345 net/hinic: not in enabled drivers build config
00:02:36.345 net/hns3: not in enabled drivers build config
00:02:36.345 net/i40e: not in enabled drivers build config
00:02:36.345 net/iavf: not in enabled drivers build config
00:02:36.345 net/ice: not in enabled drivers build config
00:02:36.345 net/idpf: not in enabled drivers build config
00:02:36.345 net/igc: not in enabled drivers build config
00:02:36.345 net/ionic: not in enabled drivers build config
00:02:36.345 net/ipn3ke: not in enabled drivers build config
00:02:36.345 net/ixgbe: not in enabled drivers build config
00:02:36.345 net/mana: not in enabled drivers build config
00:02:36.345 net/memif: not in enabled drivers build config
00:02:36.345 net/mlx4: not in enabled drivers build config
00:02:36.345 net/mlx5: not in enabled drivers build config
00:02:36.345 net/mvneta: not in enabled drivers build config
00:02:36.345 net/mvpp2: not in enabled drivers build config
00:02:36.345 net/netvsc: not in enabled drivers build config
00:02:36.345 net/nfb: not in enabled drivers build config
00:02:36.345 net/nfp: not in enabled drivers build config
00:02:36.345 net/ngbe: not in enabled drivers build config
00:02:36.345 net/null: not in enabled drivers build config
00:02:36.345 net/octeontx: not in enabled drivers build config
00:02:36.345 net/octeon_ep: not in enabled drivers build config
00:02:36.345 net/pcap: not in enabled drivers build config
00:02:36.345 net/pfe: not in enabled drivers build config
00:02:36.345 net/qede: not in enabled drivers build config
00:02:36.345 net/ring: not in enabled drivers build config
00:02:36.345 net/sfc: not in enabled drivers build config
00:02:36.345 net/softnic: not in enabled drivers build config
00:02:36.345 net/tap: not in enabled drivers build config
00:02:36.345 net/thunderx: not in enabled drivers build config
00:02:36.345 net/txgbe: not in enabled drivers build config
00:02:36.345 net/vdev_netvsc: not in enabled drivers build config
00:02:36.345 net/vhost: not in enabled drivers build config
00:02:36.345 net/virtio: not in enabled drivers build config
00:02:36.345 net/vmxnet3: not in enabled drivers build config
00:02:36.345 raw/*: missing internal dependency, "rawdev"
00:02:36.345 crypto/armv8: not in enabled drivers build config
00:02:36.345 crypto/bcmfs: not in enabled drivers build config
00:02:36.345
crypto/caam_jr: not in enabled drivers build config
00:02:36.345 crypto/ccp: not in enabled drivers build config
00:02:36.345 crypto/cnxk: not in enabled drivers build config
00:02:36.345 crypto/dpaa_sec: not in enabled drivers build config
00:02:36.345 crypto/dpaa2_sec: not in enabled drivers build config
00:02:36.345 crypto/mvsam: not in enabled drivers build config
00:02:36.345 crypto/nitrox: not in enabled drivers build config
00:02:36.345 crypto/null: not in enabled drivers build config
00:02:36.345 crypto/octeontx: not in enabled drivers build config
00:02:36.345 crypto/openssl: not in enabled drivers build config
00:02:36.345 crypto/scheduler: not in enabled drivers build config
00:02:36.345 crypto/uadk: not in enabled drivers build config
00:02:36.345 crypto/virtio: not in enabled drivers build config
00:02:36.345 compress/nitrox: not in enabled drivers build config
00:02:36.345 compress/octeontx: not in enabled drivers build config
00:02:36.345 compress/zlib: not in enabled drivers build config
00:02:36.345 regex/*: missing internal dependency, "regexdev"
00:02:36.345 ml/*: missing internal dependency, "mldev"
00:02:36.345 vdpa/ifc: not in enabled drivers build config
00:02:36.345 vdpa/mlx5: not in enabled drivers build config
00:02:36.345 vdpa/nfp: not in enabled drivers build config
00:02:36.345 vdpa/sfc: not in enabled drivers build config
00:02:36.345 event/*: missing internal dependency, "eventdev"
00:02:36.345 baseband/*: missing internal dependency, "bbdev"
00:02:36.345 gpu/*: missing internal dependency, "gpudev"
00:02:36.345
00:02:36.345
00:02:36.345 Build targets in project: 115
00:02:36.345
00:02:36.345 DPDK 24.03.0
00:02:36.345
00:02:36.345 User defined options
00:02:36.345 buildtype : debug
00:02:36.345 default_library : shared
00:02:36.345 libdir : lib
00:02:36.345 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:02:36.345 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror
00:02:36.345 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal
00:02:36.345 cpu_instruction_set: native
00:02:36.345 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test
00:02:36.345 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table
00:02:36.345 enable_docs : false
00:02:36.345 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5
00:02:36.345 enable_kmods : false
00:02:36.345 max_lcores : 128
00:02:36.345 tests : false
00:02:36.345
00:02:36.345 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja
00:02:36.610 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp'
00:02:36.610 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o
00:02:36.610 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o
00:02:36.610 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o
00:02:36.610 [4/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o
00:02:36.869 [5/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o
00:02:36.869 [6/378] Compiling C object
lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:36.869 [7/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:36.869 [8/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:36.869 [9/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:36.869 [10/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:36.869 [11/378] Linking static target lib/librte_kvargs.a 00:02:36.869 [12/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:36.869 [13/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:36.869 [14/378] Linking static target lib/librte_log.a 00:02:36.869 [15/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:36.869 [16/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:36.869 [17/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:36.869 [18/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:36.869 [19/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:37.126 [20/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:37.126 [21/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:37.126 [22/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:37.386 [23/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:37.386 [24/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:37.386 [25/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:37.386 [26/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:37.386 [27/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:37.386 [28/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:37.386 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:37.386 [30/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:37.386 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:37.386 [32/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:37.386 [33/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:37.386 [34/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:37.386 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:37.386 [36/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:37.386 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:37.387 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:37.387 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:37.387 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:37.387 [41/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:37.387 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:37.387 [43/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:37.387 [44/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:37.387 [45/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:37.387 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:37.387 [47/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:37.387 [48/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:37.387 [49/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:37.387 [50/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:37.387 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:37.387 [52/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:37.387 [53/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:37.387 [54/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:37.387 [55/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:37.387 [56/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:37.387 [57/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:37.387 [58/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:37.387 [59/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:37.387 [60/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:37.387 [61/378] Linking static target lib/librte_telemetry.a 00:02:37.387 [62/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:37.387 [63/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:37.387 [64/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:37.387 [65/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.387 [66/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:37.387 [67/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:37.387 [68/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:37.387 [69/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:37.387 [70/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:37.387 [71/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:37.387 [72/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:37.387 [73/378] Linking static target lib/librte_ring.a 
00:02:37.387 [74/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:37.387 [75/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:37.387 [76/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:37.387 [77/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:37.387 [78/378] Linking static target lib/librte_pci.a 00:02:37.387 [79/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:37.387 [80/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:02:37.387 [81/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:37.387 [82/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:37.387 [83/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:37.387 [84/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:37.387 [85/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:37.387 [86/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:37.387 [87/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:37.387 [88/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:37.387 [89/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:02:37.649 [90/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:37.649 [91/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:37.649 [92/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:37.649 [93/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:37.649 [94/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:37.649 [95/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:37.649 [96/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 
00:02:37.649 [97/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:37.649 [98/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:37.649 [99/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:37.649 [100/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:37.649 [101/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:37.649 [102/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:37.649 [103/378] Linking static target lib/librte_mempool.a 00:02:37.649 [104/378] Linking static target lib/librte_rcu.a 00:02:37.649 [105/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:37.649 [106/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:37.649 [107/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:37.649 [108/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:37.649 [109/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:37.649 [110/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:37.912 [111/378] Linking static target lib/librte_meter.a 00:02:37.912 [112/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:37.912 [113/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.912 [114/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:37.912 [115/378] Linking target lib/librte_log.so.24.1 00:02:37.912 [116/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:37.912 [117/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:37.912 [118/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:37.912 [119/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 
00:02:37.912 [120/378] Linking static target lib/librte_mbuf.a
00:02:37.912 [121/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o
00:02:37.912 [122/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o
00:02:37.912 [123/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o
00:02:38.175 [124/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o
00:02:38.175 [125/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o
00:02:38.175 [126/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.175 [127/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o
00:02:38.175 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o
00:02:38.175 [129/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o
00:02:38.175 [130/378] Linking static target lib/librte_timer.a
00:02:38.175 [131/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o
00:02:38.175 [132/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o
00:02:38.175 [133/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o
00:02:38.175 [134/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o
00:02:38.175 [135/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o
00:02:38.175 [136/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o
00:02:38.175 [137/378] Linking static target lib/librte_cmdline.a
00:02:38.175 [138/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols
00:02:38.175 [139/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o
00:02:38.175 [140/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o
00:02:38.175 [141/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o
00:02:38.175 [142/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o
00:02:38.175 [143/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o
00:02:38.175 [144/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o
00:02:38.175 [145/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o
00:02:38.175 [146/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.175 [147/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o
00:02:38.175 [148/378] Linking target lib/librte_kvargs.so.24.1
00:02:38.175 [149/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o
00:02:38.175 [150/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.175 [151/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.175 [152/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o
00:02:38.175 [153/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o
00:02:38.175 [154/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o
00:02:38.175 [155/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o
00:02:38.175 [156/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o
00:02:38.175 [157/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o
00:02:38.175 [158/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o
00:02:38.175 [159/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o
00:02:38.175 [160/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o
00:02:38.175 [161/378] Linking target lib/librte_telemetry.so.24.1
00:02:38.175 [162/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o
00:02:38.175 [163/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o
00:02:38.175 [164/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o
00:02:38.175 [165/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o
00:02:38.435 [166/378] Linking static target lib/librte_dmadev.a
00:02:38.436 [167/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a
00:02:38.436 [168/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o
00:02:38.436 [169/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o
00:02:38.436 [170/378] Linking static target lib/librte_compressdev.a
00:02:38.436 [171/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o
00:02:38.436 [172/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o
00:02:38.436 [173/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o
00:02:38.436 [174/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o
00:02:38.436 [175/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o
00:02:38.436 [176/378] Linking static target lib/librte_power.a
00:02:38.436 [177/378] Linking static target lib/librte_reorder.a
00:02:38.436 [178/378] Linking static target lib/librte_net.a
00:02:38.436 [179/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o
00:02:38.436 [180/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o
00:02:38.436 [181/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols
00:02:38.436 [182/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o
00:02:38.436 [183/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o
00:02:38.436 [184/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o
00:02:38.436 [185/378] Linking static target drivers/libtmp_rte_bus_vdev.a
00:02:38.436 [186/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o
00:02:38.436 [187/378] Linking static target drivers/libtmp_rte_bus_pci.a
00:02:38.436 [188/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o
00:02:38.436 [189/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o
00:02:38.436 [190/378] Linking static target lib/librte_eal.a
00:02:38.436 [191/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o
00:02:38.436 [192/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o
00:02:38.436 [193/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols
00:02:38.701 [194/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o
00:02:38.701 [195/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command
00:02:38.701 [196/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o
00:02:38.701 [197/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o
00:02:38.701 [198/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o
00:02:38.701 [199/378] Linking static target drivers/librte_bus_auxiliary.a
00:02:38.701 [200/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o
00:02:38.701 [201/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o
00:02:38.701 [202/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o
00:02:38.701 [203/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o
00:02:38.701 [204/378] Linking static target lib/librte_hash.a
00:02:38.961 [205/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o
00:02:38.961 [206/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.961 [207/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command
00:02:38.961 [208/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o
00:02:38.961 [209/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o
00:02:38.961 [210/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.961 [211/378] Generating drivers/rte_bus_pci.pmd.c with a custom command
00:02:38.961 [212/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o
00:02:38.961 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o
00:02:38.961 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o
00:02:38.961 [215/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o
00:02:38.961 [216/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:38.961 [217/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o
00:02:38.961 [218/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:38.961 [219/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o
00:02:38.961 [220/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o
00:02:38.961 [221/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o
00:02:38.961 [222/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o
00:02:38.961 [223/378] Linking static target drivers/librte_bus_pci.a
00:02:38.961 [224/378] Linking static target drivers/librte_bus_vdev.a
00:02:38.961 [225/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o
00:02:38.961 [226/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o
00:02:38.961 [227/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o
00:02:38.961 [228/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o
00:02:38.961 [229/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o
00:02:38.961 [230/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o
00:02:38.961 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o
00:02:38.961 [232/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.961 [233/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o
00:02:38.961 [234/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o
00:02:38.961 [235/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o
00:02:38.961 [236/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.961 [237/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o
00:02:38.961 [238/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o
00:02:38.961 [239/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o
00:02:38.961 [240/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o
00:02:38.961 [241/378] Linking static target drivers/libtmp_rte_mempool_ring.a
00:02:38.961 [242/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output)
00:02:38.961 [243/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o
00:02:39.220 [244/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.220 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o
00:02:39.220 [246/378] Linking static target lib/librte_cryptodev.a
00:02:39.220 [247/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o
00:02:39.220 [248/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o
00:02:39.220 [249/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.220 [250/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o
00:02:39.220 [251/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o
00:02:39.220 [252/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o
00:02:39.220 [253/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.220 [254/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o
00:02:39.220 [255/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o
00:02:39.220 [256/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o
00:02:39.220 [257/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o
00:02:39.220 [258/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o
00:02:39.220 [259/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command
00:02:39.220 [260/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.220 [261/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:39.220 [262/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o
00:02:39.220 [263/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o
00:02:39.220 [264/378] Linking static target drivers/librte_mempool_ring.a
00:02:39.478 [265/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o
00:02:39.478 [266/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o
00:02:39.478 [267/378] Linking static target drivers/libtmp_rte_common_mlx5.a
00:02:39.478 [268/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o
00:02:39.478 [269/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o
00:02:39.478 [270/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.478 [271/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o
00:02:39.478 [272/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o
00:02:39.478 [273/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o
00:02:39.478 [274/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o
00:02:39.478 [275/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o
00:02:39.478 [276/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o
00:02:39.478 [277/378] Linking static target lib/librte_security.a
00:02:39.478 [278/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o
00:02:39.478 [279/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a
00:02:39.478 [280/378] Linking static target drivers/libtmp_rte_compress_isal.a
00:02:39.478 [281/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o
00:02:39.478 [282/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o
00:02:39.478 [283/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o
00:02:39.478 [284/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o
00:02:39.478 [285/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.736 [286/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o
00:02:39.736 [287/378] Linking static target lib/librte_ethdev.a
00:02:39.736 [288/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command
00:02:39.736 [289/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o
00:02:39.736 [290/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o
00:02:39.736 [291/378] Generating drivers/rte_compress_isal.pmd.c with a custom command
00:02:39.736 [292/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a
00:02:39.736 [293/378] Linking static target drivers/libtmp_rte_compress_mlx5.a
00:02:39.736 [294/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o
00:02:39.736 [295/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o
00:02:39.736 [296/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o
00:02:39.736 [297/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command
00:02:39.736 [298/378] Linking static target drivers/librte_common_mlx5.a
00:02:39.736 [299/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o
00:02:39.736 [300/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o
00:02:39.736 [301/378] Linking static target drivers/librte_compress_isal.a
00:02:39.736 [302/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o
00:02:39.737 [303/378] Linking static target drivers/librte_crypto_mlx5.a
00:02:39.737 [304/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.995 [305/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o
00:02:39.995 [306/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output)
00:02:39.995 [307/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command
00:02:39.995 [308/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o
00:02:39.995 [309/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o
00:02:39.995 [310/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command
00:02:39.995 [311/378] Linking static target drivers/librte_compress_mlx5.a
00:02:39.995 [312/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o
00:02:39.995 [313/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o
00:02:40.254 [314/378] Linking static target drivers/librte_crypto_ipsec_mb.a
00:02:40.254 [315/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output)
00:02:40.254 [316/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o
00:02:40.254 [317/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o
00:02:40.823 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o
00:02:40.823 [319/378] Linking static target drivers/libtmp_rte_common_qat.a
00:02:41.082 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command
00:02:41.082 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o
00:02:41.082 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o
00:02:41.082 [323/378] Linking static target drivers/librte_common_qat.a
00:02:41.341 [324/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o
00:02:41.341 [325/378] Linking static target lib/librte_vhost.a
00:02:41.341 [326/378] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:43.876 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output)
00:02:46.468 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output)
00:02:50.664 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output)
00:02:52.045 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output)
00:02:52.045 [331/378] Linking target lib/librte_eal.so.24.1
00:02:52.305 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols
00:02:52.305 [333/378] Linking target lib/librte_dmadev.so.24.1
00:02:52.305 [334/378] Linking target lib/librte_meter.so.24.1
00:02:52.305 [335/378] Linking target lib/librte_ring.so.24.1
00:02:52.305 [336/378] Linking target lib/librte_pci.so.24.1
00:02:52.305 [337/378] Linking target lib/librte_timer.so.24.1
00:02:52.305 [338/378] Linking target drivers/librte_bus_auxiliary.so.24.1
00:02:52.305 [339/378] Linking target drivers/librte_bus_vdev.so.24.1
00:02:52.564 [340/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols
00:02:52.564 [341/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols
00:02:52.564 [342/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols
00:02:52.564 [343/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols
00:02:52.564 [344/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols
00:02:52.564 [345/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols
00:02:52.564 [346/378] Linking target lib/librte_mempool.so.24.1
00:02:52.564 [347/378] Linking target lib/librte_rcu.so.24.1
00:02:52.564 [348/378] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols
00:02:52.824 [349/378] Linking target drivers/librte_bus_pci.so.24.1
00:02:52.824 [350/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols
00:02:52.824 [351/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols
00:02:52.824 [352/378] Linking target lib/librte_mbuf.so.24.1
00:02:52.824 [353/378] Linking target drivers/librte_mempool_ring.so.24.1
00:02:52.824 [354/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols
00:02:53.083 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols
00:02:53.083 [356/378] Linking target lib/librte_net.so.24.1
00:02:53.083 [357/378] Linking target lib/librte_reorder.so.24.1
00:02:53.083 [358/378] Linking target lib/librte_compressdev.so.24.1
00:02:53.083 [359/378] Linking target lib/librte_cryptodev.so.24.1
00:02:53.342 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols
00:02:53.342 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols
00:02:53.342 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols
00:02:53.342 [363/378] Linking target lib/librte_security.so.24.1
00:02:53.342 [364/378] Linking target lib/librte_hash.so.24.1
00:02:53.342 [365/378] Linking target lib/librte_cmdline.so.24.1
00:02:53.342 [366/378] Linking target drivers/librte_compress_isal.so.24.1
00:02:53.342 [367/378] Linking target lib/librte_ethdev.so.24.1
00:02:53.342 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols
00:02:53.342 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols
00:02:53.342 [370/378] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols
00:02:53.601 [371/378] Linking target lib/librte_power.so.24.1
00:02:53.601 [372/378] Linking target lib/librte_vhost.so.24.1
00:02:53.601 [373/378] Linking target drivers/librte_common_mlx5.so.24.1
00:02:53.860 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols
00:02:53.860 [375/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1
00:02:53.860 [376/378] Linking target drivers/librte_common_qat.so.24.1
00:02:53.860 [377/378] Linking target drivers/librte_crypto_mlx5.so.24.1
00:02:53.860 [378/378] Linking target drivers/librte_compress_mlx5.so.24.1
00:02:53.860 INFO: autodetecting backend as ninja
00:02:53.860 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 72
00:02:55.239 CC lib/log/log.o
00:02:55.239 CC lib/log/log_flags.o
00:02:55.239 CC lib/ut_mock/mock.o
00:02:55.239 CC lib/log/log_deprecated.o
00:02:55.239 CC lib/ut/ut.o
00:02:55.239 LIB libspdk_ut.a
00:02:55.239 SO libspdk_ut.so.2.0
00:02:55.239 LIB libspdk_ut_mock.a
00:02:55.498 LIB libspdk_log.a
00:02:55.498 SO libspdk_ut_mock.so.6.0
00:02:55.498 SO libspdk_log.so.7.0
00:02:55.498 SYMLINK libspdk_ut.so
00:02:55.498 SYMLINK libspdk_ut_mock.so
00:02:55.498 SYMLINK libspdk_log.so
00:02:56.067 CC lib/util/base64.o
00:02:56.067 CC lib/util/bit_array.o
00:02:56.067 CC lib/dma/dma.o
00:02:56.067 CC lib/util/cpuset.o
00:02:56.067 CC lib/util/crc16.o
00:02:56.067 CC lib/util/crc32.o
00:02:56.067 CC lib/ioat/ioat.o
00:02:56.067 CC lib/util/crc32c.o
00:02:56.067 CC lib/util/crc32_ieee.o
00:02:56.067 CXX lib/trace_parser/trace.o
00:02:56.067 CC lib/util/crc64.o
00:02:56.067 CC lib/util/dif.o
00:02:56.067 CC lib/util/fd.o
00:02:56.067 CC lib/util/file.o
00:02:56.067 CC lib/util/hexlify.o
00:02:56.067 CC lib/util/iov.o
00:02:56.067 CC lib/util/math.o
00:02:56.067 CC lib/util/pipe.o
00:02:56.067 CC lib/util/strerror_tls.o
00:02:56.067 CC lib/util/string.o
00:02:56.067 CC lib/util/uuid.o
00:02:56.067 CC lib/util/fd_group.o
00:02:56.067 CC lib/util/zipf.o
00:02:56.067 CC lib/util/xor.o
00:02:56.067 CC lib/vfio_user/host/vfio_user_pci.o
00:02:56.067 CC lib/vfio_user/host/vfio_user.o
00:02:56.067 LIB libspdk_dma.a
00:02:56.067 SO libspdk_dma.so.4.0
00:02:56.327 SYMLINK libspdk_dma.so
00:02:56.327 LIB libspdk_ioat.a
00:02:56.327 SO libspdk_ioat.so.7.0
00:02:56.327 SYMLINK libspdk_ioat.so
00:02:56.586 LIB libspdk_util.a
00:02:56.586 SO libspdk_util.so.9.1
00:02:56.586 LIB libspdk_vfio_user.a
00:02:56.586 SO libspdk_vfio_user.so.5.0
00:02:56.845 SYMLINK libspdk_util.so
00:02:56.845 SYMLINK libspdk_vfio_user.so
00:02:56.845 LIB libspdk_trace_parser.a
00:02:56.845 SO libspdk_trace_parser.so.5.0
00:02:57.104 SYMLINK libspdk_trace_parser.so
00:02:57.104 CC lib/json/json_util.o
00:02:57.104 CC lib/json/json_parse.o
00:02:57.104 CC lib/json/json_write.o
00:02:57.104 CC lib/rdma_utils/rdma_utils.o
00:02:57.104 CC lib/rdma_provider/common.o
00:02:57.104 CC lib/idxd/idxd_user.o
00:02:57.104 CC lib/idxd/idxd.o
00:02:57.104 CC lib/rdma_provider/rdma_provider_verbs.o
00:02:57.104 CC lib/env_dpdk/env.o
00:02:57.104 CC lib/idxd/idxd_kernel.o
00:02:57.104 CC lib/conf/conf.o
00:02:57.104 CC lib/env_dpdk/memory.o
00:02:57.104 CC lib/env_dpdk/pci.o
00:02:57.104 CC lib/vmd/vmd.o
00:02:57.104 CC lib/env_dpdk/init.o
00:02:57.104 CC lib/vmd/led.o
00:02:57.104 CC lib/env_dpdk/threads.o
00:02:57.104 CC lib/env_dpdk/pci_ioat.o
00:02:57.104 CC lib/env_dpdk/pci_virtio.o
00:02:57.104 CC lib/env_dpdk/pci_vmd.o
00:02:57.104 CC lib/env_dpdk/pci_idxd.o
00:02:57.104 CC lib/reduce/reduce.o
00:02:57.104 CC lib/env_dpdk/pci_event.o
00:02:57.104 CC lib/env_dpdk/sigbus_handler.o
00:02:57.104 CC lib/env_dpdk/pci_dpdk.o
00:02:57.104 CC lib/env_dpdk/pci_dpdk_2207.o
00:02:57.104 CC lib/env_dpdk/pci_dpdk_2211.o
00:02:57.364 LIB libspdk_conf.a
00:02:57.364 SO libspdk_conf.so.6.0
00:02:57.364 LIB libspdk_rdma_utils.a
00:02:57.364 LIB libspdk_json.a
00:02:57.364 SYMLINK libspdk_conf.so
00:02:57.364 SO libspdk_rdma_utils.so.1.0
00:02:57.623 SO libspdk_json.so.6.0
00:02:57.623 SYMLINK libspdk_rdma_utils.so
00:02:57.623 LIB libspdk_rdma_provider.a
00:02:57.623 SYMLINK libspdk_json.so
00:02:57.623 SO libspdk_rdma_provider.so.6.0
00:02:57.623 SYMLINK libspdk_rdma_provider.so
00:02:57.882 LIB libspdk_idxd.a
00:02:57.882 SO libspdk_idxd.so.12.0
00:02:57.882 LIB libspdk_reduce.a
00:02:57.882 SO libspdk_reduce.so.6.0
00:02:57.882 SYMLINK libspdk_idxd.so
00:02:57.882 SYMLINK libspdk_reduce.so
00:02:57.882 CC lib/jsonrpc/jsonrpc_server_tcp.o
00:02:57.882 CC lib/jsonrpc/jsonrpc_server.o
00:02:57.882 CC lib/jsonrpc/jsonrpc_client.o
00:02:57.882 CC lib/jsonrpc/jsonrpc_client_tcp.o
00:02:58.142 LIB libspdk_jsonrpc.a
00:02:58.401 SO libspdk_jsonrpc.so.6.0
00:02:58.401 SYMLINK libspdk_jsonrpc.so
00:02:58.401 LIB libspdk_vmd.a
00:02:58.661 SO libspdk_vmd.so.6.0
00:02:58.661 SYMLINK libspdk_vmd.so
00:02:58.661 CC lib/rpc/rpc.o
00:02:58.921 LIB libspdk_rpc.a
00:02:58.921 SO libspdk_rpc.so.6.0
00:02:59.181 SYMLINK libspdk_rpc.so
00:02:59.441 CC lib/trace/trace.o
00:02:59.441 CC lib/trace/trace_rpc.o
00:02:59.441 CC lib/trace/trace_flags.o
00:02:59.441 CC lib/keyring/keyring.o
00:02:59.441 CC lib/keyring/keyring_rpc.o
00:02:59.441 CC lib/notify/notify.o
00:02:59.441 CC lib/notify/notify_rpc.o
00:02:59.700 LIB libspdk_notify.a
00:02:59.700 LIB libspdk_trace.a
00:02:59.700 SO libspdk_notify.so.6.0
00:02:59.700 LIB libspdk_keyring.a
00:02:59.700 SO libspdk_trace.so.10.0
00:02:59.700 SO libspdk_keyring.so.1.0
00:02:59.700 SYMLINK libspdk_notify.so
00:02:59.700 SYMLINK libspdk_trace.so
00:02:59.958 SYMLINK libspdk_keyring.so
00:03:00.216 CC lib/thread/thread.o
00:03:00.216 CC lib/sock/sock.o
00:03:00.216 CC lib/thread/iobuf.o
00:03:00.216 CC lib/sock/sock_rpc.o
00:03:00.216 LIB libspdk_env_dpdk.a
00:03:00.216 SO libspdk_env_dpdk.so.14.1
00:03:00.476 SYMLINK libspdk_env_dpdk.so
00:03:00.476 LIB libspdk_sock.a
00:03:00.736 SO libspdk_sock.so.10.0
00:03:00.736 SYMLINK libspdk_sock.so
00:03:00.995 CC lib/nvme/nvme_ctrlr_cmd.o
00:03:00.995 CC lib/nvme/nvme_ctrlr.o
00:03:00.995 CC lib/nvme/nvme_fabric.o
00:03:00.995 CC lib/nvme/nvme_ns_cmd.o
00:03:00.995 CC lib/nvme/nvme_ns.o
00:03:00.995 CC lib/nvme/nvme_pcie_common.o
00:03:00.996 CC lib/nvme/nvme_pcie.o
00:03:00.996 CC lib/nvme/nvme_qpair.o
00:03:00.996 CC lib/nvme/nvme.o
00:03:00.996 CC lib/nvme/nvme_quirks.o
00:03:00.996 CC lib/nvme/nvme_transport.o
00:03:00.996 CC lib/nvme/nvme_discovery.o
00:03:00.996 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o
00:03:00.996 CC lib/nvme/nvme_ns_ocssd_cmd.o
00:03:00.996 CC lib/nvme/nvme_tcp.o
00:03:00.996 CC lib/nvme/nvme_opal.o
00:03:00.996 CC lib/nvme/nvme_io_msg.o
00:03:00.996 CC lib/nvme/nvme_poll_group.o
00:03:00.996 CC lib/nvme/nvme_zns.o
00:03:00.996 CC lib/nvme/nvme_stubs.o
00:03:00.996 CC lib/nvme/nvme_auth.o
00:03:00.996 CC lib/nvme/nvme_cuse.o
00:03:00.996 CC lib/nvme/nvme_rdma.o
00:03:01.254 LIB libspdk_thread.a
00:03:01.254 SO libspdk_thread.so.10.1
00:03:01.514 SYMLINK libspdk_thread.so
00:03:01.773 CC lib/init/json_config.o
00:03:01.773 CC lib/init/subsystem.o
00:03:01.773 CC lib/init/rpc.o
00:03:01.773 CC lib/init/subsystem_rpc.o
00:03:01.773 CC lib/virtio/virtio.o
00:03:01.773 CC lib/virtio/virtio_vfio_user.o
00:03:01.773 CC lib/virtio/virtio_vhost_user.o
00:03:01.773 CC lib/virtio/virtio_pci.o
00:03:01.773 CC lib/blob/blobstore.o
00:03:01.773 CC lib/blob/zeroes.o
00:03:01.773 CC lib/blob/request.o
00:03:01.773 CC lib/accel/accel.o
00:03:01.773 CC lib/blob/blob_bs_dev.o
00:03:01.773 CC lib/accel/accel_rpc.o
00:03:01.773 CC lib/accel/accel_sw.o
00:03:02.032 LIB libspdk_init.a
00:03:02.032 SO libspdk_init.so.5.0
00:03:02.032 LIB libspdk_virtio.a
00:03:02.293 SYMLINK libspdk_init.so
00:03:02.293 SO libspdk_virtio.so.7.0
00:03:02.293 SYMLINK libspdk_virtio.so
00:03:02.552 CC lib/event/app.o
00:03:02.552 CC lib/event/reactor.o
00:03:02.552 CC lib/event/app_rpc.o
00:03:02.553 CC lib/event/log_rpc.o
00:03:02.553 CC lib/event/scheduler_static.o
00:03:02.814 LIB libspdk_accel.a
00:03:02.814 SO libspdk_accel.so.15.1
00:03:02.814 SYMLINK libspdk_accel.so
00:03:03.073 LIB libspdk_event.a
00:03:03.073 SO libspdk_event.so.14.0
00:03:03.073 SYMLINK libspdk_event.so
00:03:03.331 CC lib/bdev/bdev.o
00:03:03.331 CC lib/bdev/bdev_rpc.o
00:03:03.331 CC lib/bdev/bdev_zone.o
00:03:03.331 CC lib/bdev/part.o
00:03:03.331 CC lib/bdev/scsi_nvme.o
00:03:03.589 LIB libspdk_nvme.a
00:03:03.589 SO libspdk_nvme.so.13.1
00:03:04.154 SYMLINK libspdk_nvme.so
00:03:04.720 LIB libspdk_blob.a
00:03:04.978 SO libspdk_blob.so.11.0
00:03:04.978 SYMLINK libspdk_blob.so
00:03:05.545 CC lib/blobfs/blobfs.o
00:03:05.545 CC lib/lvol/lvol.o
00:03:05.545 CC lib/blobfs/tree.o
00:03:06.481 LIB libspdk_blobfs.a
00:03:06.481 SO libspdk_blobfs.so.10.0
00:03:06.481 LIB libspdk_lvol.a
00:03:06.481 SO libspdk_lvol.so.10.0
00:03:06.481 SYMLINK libspdk_blobfs.so
00:03:06.481 SYMLINK libspdk_lvol.so
00:03:09.025 LIB libspdk_bdev.a
00:03:09.025 SO libspdk_bdev.so.15.1
00:03:09.025 SYMLINK libspdk_bdev.so
00:03:09.603 CC lib/ublk/ublk.o
00:03:09.603 CC lib/ublk/ublk_rpc.o
00:03:09.603 CC lib/nbd/nbd.o
00:03:09.603 CC lib/scsi/dev.o
00:03:09.603 CC lib/nvmf/ctrlr.o
00:03:09.603 CC lib/scsi/lun.o
00:03:09.603 CC lib/nbd/nbd_rpc.o
00:03:09.603 CC lib/nvmf/ctrlr_discovery.o
00:03:09.603 CC lib/ftl/ftl_core.o
00:03:09.603 CC lib/scsi/port.o
00:03:09.603 CC lib/ftl/ftl_init.o
00:03:09.603 CC lib/nvmf/ctrlr_bdev.o
00:03:09.603 CC lib/scsi/scsi.o
00:03:09.603 CC lib/ftl/ftl_layout.o
00:03:09.603 CC lib/ftl/ftl_io.o
00:03:09.603 CC lib/nvmf/subsystem.o
00:03:09.603 CC lib/ftl/ftl_debug.o
00:03:09.603 CC lib/scsi/scsi_bdev.o 00:03:09.603 CC lib/nvmf/nvmf.o 00:03:09.603 CC lib/scsi/scsi_pr.o 00:03:09.603 CC lib/ftl/ftl_sb.o 00:03:09.603 CC lib/nvmf/nvmf_rpc.o 00:03:09.603 CC lib/scsi/scsi_rpc.o 00:03:09.603 CC lib/ftl/ftl_l2p.o 00:03:09.603 CC lib/nvmf/transport.o 00:03:09.603 CC lib/nvmf/tcp.o 00:03:09.603 CC lib/scsi/task.o 00:03:09.603 CC lib/ftl/ftl_l2p_flat.o 00:03:09.603 CC lib/nvmf/stubs.o 00:03:09.603 CC lib/ftl/ftl_nv_cache.o 00:03:09.603 CC lib/ftl/ftl_band.o 00:03:09.603 CC lib/nvmf/mdns_server.o 00:03:09.603 CC lib/ftl/ftl_band_ops.o 00:03:09.603 CC lib/nvmf/rdma.o 00:03:09.603 CC lib/nvmf/auth.o 00:03:09.603 CC lib/ftl/ftl_writer.o 00:03:09.603 CC lib/ftl/ftl_rq.o 00:03:09.603 CC lib/ftl/ftl_reloc.o 00:03:09.603 CC lib/ftl/ftl_l2p_cache.o 00:03:09.603 CC lib/ftl/ftl_p2l.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_startup.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_md.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_misc.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_band.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:03:09.603 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:03:09.603 CC lib/ftl/utils/ftl_conf.o 00:03:09.603 CC lib/ftl/utils/ftl_md.o 00:03:09.603 CC lib/ftl/utils/ftl_mempool.o 00:03:09.603 CC lib/ftl/utils/ftl_bitmap.o 00:03:09.603 CC lib/ftl/utils/ftl_property.o 00:03:09.603 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:03:09.604 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:03:09.604 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:03:09.604 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:03:09.604 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:03:09.604 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:03:09.604 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:03:09.604 CC 
lib/ftl/upgrade/ftl_sb_v3.o 00:03:09.604 CC lib/ftl/upgrade/ftl_sb_v5.o 00:03:09.604 CC lib/ftl/nvc/ftl_nvc_dev.o 00:03:09.604 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:03:09.604 CC lib/ftl/base/ftl_base_bdev.o 00:03:09.604 CC lib/ftl/base/ftl_base_dev.o 00:03:09.604 CC lib/ftl/ftl_trace.o 00:03:10.168 LIB libspdk_nbd.a 00:03:10.168 SO libspdk_nbd.so.7.0 00:03:10.168 SYMLINK libspdk_nbd.so 00:03:10.168 LIB libspdk_scsi.a 00:03:10.168 SO libspdk_scsi.so.9.0 00:03:10.426 LIB libspdk_ublk.a 00:03:10.426 SO libspdk_ublk.so.3.0 00:03:10.426 SYMLINK libspdk_scsi.so 00:03:10.426 SYMLINK libspdk_ublk.so 00:03:10.685 LIB libspdk_ftl.a 00:03:10.685 CC lib/vhost/vhost_rpc.o 00:03:10.685 CC lib/vhost/vhost.o 00:03:10.685 CC lib/iscsi/conn.o 00:03:10.685 CC lib/iscsi/init_grp.o 00:03:10.685 CC lib/iscsi/iscsi.o 00:03:10.685 CC lib/vhost/vhost_scsi.o 00:03:10.685 CC lib/vhost/vhost_blk.o 00:03:10.685 CC lib/iscsi/md5.o 00:03:10.685 CC lib/vhost/rte_vhost_user.o 00:03:10.685 CC lib/iscsi/param.o 00:03:10.685 CC lib/iscsi/portal_grp.o 00:03:10.685 CC lib/iscsi/tgt_node.o 00:03:10.685 CC lib/iscsi/iscsi_subsystem.o 00:03:10.685 CC lib/iscsi/iscsi_rpc.o 00:03:10.685 CC lib/iscsi/task.o 00:03:10.943 SO libspdk_ftl.so.9.0 00:03:11.217 SYMLINK libspdk_ftl.so 00:03:11.865 LIB libspdk_nvmf.a 00:03:11.865 LIB libspdk_iscsi.a 00:03:11.865 LIB libspdk_vhost.a 00:03:11.865 SO libspdk_nvmf.so.18.1 00:03:11.865 SO libspdk_vhost.so.8.0 00:03:11.865 SO libspdk_iscsi.so.8.0 00:03:12.125 SYMLINK libspdk_vhost.so 00:03:12.125 SYMLINK libspdk_nvmf.so 00:03:12.125 SYMLINK libspdk_iscsi.so 00:03:12.693 CC module/env_dpdk/env_dpdk_rpc.o 00:03:12.952 LIB libspdk_env_dpdk_rpc.a 00:03:12.952 CC module/scheduler/dynamic/scheduler_dynamic.o 00:03:12.952 CC module/accel/iaa/accel_iaa.o 00:03:12.952 CC module/accel/iaa/accel_iaa_rpc.o 00:03:12.952 CC module/keyring/linux/keyring.o 00:03:12.952 CC module/keyring/linux/keyring_rpc.o 00:03:12.952 CC module/accel/dsa/accel_dsa_rpc.o 00:03:12.952 CC 
module/accel/dsa/accel_dsa.o 00:03:12.952 CC module/accel/ioat/accel_ioat.o 00:03:12.952 CC module/accel/ioat/accel_ioat_rpc.o 00:03:12.952 CC module/keyring/file/keyring.o 00:03:12.952 CC module/blob/bdev/blob_bdev.o 00:03:12.952 CC module/keyring/file/keyring_rpc.o 00:03:12.952 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:03:12.952 CC module/scheduler/gscheduler/gscheduler.o 00:03:12.952 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:03:12.952 CC module/sock/posix/posix.o 00:03:12.952 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:03:12.952 CC module/accel/error/accel_error_rpc.o 00:03:12.952 CC module/accel/error/accel_error.o 00:03:12.952 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:03:12.952 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:03:12.952 SO libspdk_env_dpdk_rpc.so.6.0 00:03:12.952 SYMLINK libspdk_env_dpdk_rpc.so 00:03:13.212 LIB libspdk_keyring_linux.a 00:03:13.212 LIB libspdk_keyring_file.a 00:03:13.212 LIB libspdk_scheduler_gscheduler.a 00:03:13.212 LIB libspdk_scheduler_dynamic.a 00:03:13.212 SO libspdk_keyring_linux.so.1.0 00:03:13.212 LIB libspdk_scheduler_dpdk_governor.a 00:03:13.212 LIB libspdk_accel_iaa.a 00:03:13.212 SO libspdk_keyring_file.so.1.0 00:03:13.212 SO libspdk_scheduler_gscheduler.so.4.0 00:03:13.212 LIB libspdk_accel_ioat.a 00:03:13.212 LIB libspdk_accel_error.a 00:03:13.212 SO libspdk_scheduler_dynamic.so.4.0 00:03:13.212 SO libspdk_scheduler_dpdk_governor.so.4.0 00:03:13.212 SO libspdk_accel_iaa.so.3.0 00:03:13.212 SO libspdk_accel_ioat.so.6.0 00:03:13.212 SYMLINK libspdk_keyring_linux.so 00:03:13.212 SYMLINK libspdk_scheduler_gscheduler.so 00:03:13.212 LIB libspdk_accel_dsa.a 00:03:13.212 SO libspdk_accel_error.so.2.0 00:03:13.212 SYMLINK libspdk_keyring_file.so 00:03:13.212 SYMLINK libspdk_scheduler_dynamic.so 00:03:13.212 LIB libspdk_blob_bdev.a 00:03:13.212 SYMLINK libspdk_accel_iaa.so 00:03:13.212 SYMLINK libspdk_scheduler_dpdk_governor.so 00:03:13.212 SO 
libspdk_accel_dsa.so.5.0 00:03:13.212 SYMLINK libspdk_accel_ioat.so 00:03:13.212 SO libspdk_blob_bdev.so.11.0 00:03:13.212 SYMLINK libspdk_accel_error.so 00:03:13.471 SYMLINK libspdk_blob_bdev.so 00:03:13.471 SYMLINK libspdk_accel_dsa.so 00:03:13.471 LIB libspdk_sock_posix.a 00:03:13.730 SO libspdk_sock_posix.so.6.0 00:03:13.730 SYMLINK libspdk_sock_posix.so 00:03:13.988 CC module/bdev/zone_block/vbdev_zone_block.o 00:03:13.988 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:03:13.988 CC module/bdev/error/vbdev_error.o 00:03:13.988 CC module/bdev/error/vbdev_error_rpc.o 00:03:13.988 CC module/bdev/ftl/bdev_ftl_rpc.o 00:03:13.988 CC module/bdev/ftl/bdev_ftl.o 00:03:13.988 CC module/bdev/delay/vbdev_delay.o 00:03:13.988 CC module/bdev/delay/vbdev_delay_rpc.o 00:03:13.988 CC module/bdev/passthru/vbdev_passthru.o 00:03:13.988 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:03:13.988 CC module/bdev/gpt/vbdev_gpt.o 00:03:13.988 CC module/bdev/gpt/gpt.o 00:03:13.988 CC module/bdev/nvme/bdev_nvme.o 00:03:13.988 CC module/bdev/nvme/nvme_rpc.o 00:03:13.988 CC module/bdev/nvme/bdev_nvme_rpc.o 00:03:13.988 CC module/bdev/malloc/bdev_malloc.o 00:03:13.988 CC module/bdev/nvme/bdev_mdns_client.o 00:03:13.988 CC module/bdev/nvme/vbdev_opal.o 00:03:13.988 CC module/bdev/malloc/bdev_malloc_rpc.o 00:03:13.988 CC module/bdev/nvme/vbdev_opal_rpc.o 00:03:13.988 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:03:13.988 CC module/bdev/split/vbdev_split_rpc.o 00:03:13.988 CC module/bdev/null/bdev_null_rpc.o 00:03:13.988 CC module/bdev/null/bdev_null.o 00:03:13.988 CC module/bdev/split/vbdev_split.o 00:03:13.988 CC module/bdev/lvol/vbdev_lvol.o 00:03:13.988 CC module/bdev/crypto/vbdev_crypto.o 00:03:13.988 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:03:13.988 CC module/bdev/aio/bdev_aio_rpc.o 00:03:13.988 CC module/blobfs/bdev/blobfs_bdev.o 00:03:13.988 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:03:13.988 CC module/bdev/aio/bdev_aio.o 00:03:13.988 CC module/blobfs/bdev/blobfs_bdev_rpc.o 
00:03:13.988 CC module/bdev/compress/vbdev_compress.o 00:03:13.988 CC module/bdev/compress/vbdev_compress_rpc.o 00:03:13.988 CC module/bdev/virtio/bdev_virtio_blk.o 00:03:13.988 CC module/bdev/virtio/bdev_virtio_scsi.o 00:03:13.988 CC module/bdev/virtio/bdev_virtio_rpc.o 00:03:13.988 CC module/bdev/iscsi/bdev_iscsi.o 00:03:13.988 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:03:13.988 CC module/bdev/raid/bdev_raid.o 00:03:13.988 CC module/bdev/raid/bdev_raid_rpc.o 00:03:13.988 CC module/bdev/raid/bdev_raid_sb.o 00:03:13.988 CC module/bdev/raid/raid0.o 00:03:13.988 CC module/bdev/raid/concat.o 00:03:13.988 CC module/bdev/raid/raid1.o 00:03:13.988 LIB libspdk_accel_dpdk_compressdev.a 00:03:13.988 SO libspdk_accel_dpdk_compressdev.so.3.0 00:03:14.248 SYMLINK libspdk_accel_dpdk_compressdev.so 00:03:14.248 LIB libspdk_bdev_split.a 00:03:14.248 LIB libspdk_bdev_ftl.a 00:03:14.248 LIB libspdk_bdev_null.a 00:03:14.248 LIB libspdk_bdev_passthru.a 00:03:14.248 SO libspdk_bdev_split.so.6.0 00:03:14.248 LIB libspdk_blobfs_bdev.a 00:03:14.248 LIB libspdk_bdev_malloc.a 00:03:14.248 SO libspdk_bdev_ftl.so.6.0 00:03:14.248 SO libspdk_bdev_null.so.6.0 00:03:14.248 SO libspdk_bdev_passthru.so.6.0 00:03:14.248 SO libspdk_blobfs_bdev.so.6.0 00:03:14.248 SO libspdk_bdev_malloc.so.6.0 00:03:14.248 LIB libspdk_bdev_crypto.a 00:03:14.248 LIB libspdk_bdev_aio.a 00:03:14.248 SYMLINK libspdk_bdev_split.so 00:03:14.248 LIB libspdk_bdev_delay.a 00:03:14.248 SYMLINK libspdk_bdev_ftl.so 00:03:14.248 SO libspdk_bdev_crypto.so.6.0 00:03:14.248 LIB libspdk_bdev_iscsi.a 00:03:14.248 SO libspdk_bdev_aio.so.6.0 00:03:14.248 SYMLINK libspdk_bdev_null.so 00:03:14.248 LIB libspdk_bdev_zone_block.a 00:03:14.248 LIB libspdk_accel_dpdk_cryptodev.a 00:03:14.248 SYMLINK libspdk_bdev_passthru.so 00:03:14.248 SYMLINK libspdk_blobfs_bdev.so 00:03:14.508 SO libspdk_bdev_delay.so.6.0 00:03:14.508 SYMLINK libspdk_bdev_malloc.so 00:03:14.508 LIB libspdk_bdev_gpt.a 00:03:14.508 LIB libspdk_bdev_lvol.a 00:03:14.508 SO 
libspdk_bdev_iscsi.so.6.0 00:03:14.508 SO libspdk_bdev_zone_block.so.6.0 00:03:14.508 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:03:14.508 SYMLINK libspdk_bdev_aio.so 00:03:14.508 SO libspdk_bdev_gpt.so.6.0 00:03:14.508 SYMLINK libspdk_bdev_crypto.so 00:03:14.508 SO libspdk_bdev_lvol.so.6.0 00:03:14.508 SYMLINK libspdk_bdev_delay.so 00:03:14.508 LIB libspdk_bdev_error.a 00:03:14.508 SYMLINK libspdk_bdev_iscsi.so 00:03:14.508 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:03:14.508 SYMLINK libspdk_bdev_zone_block.so 00:03:14.508 SYMLINK libspdk_bdev_gpt.so 00:03:14.508 SO libspdk_bdev_error.so.6.0 00:03:14.508 SYMLINK libspdk_bdev_lvol.so 00:03:14.508 LIB libspdk_bdev_compress.a 00:03:14.508 LIB libspdk_bdev_virtio.a 00:03:14.508 SO libspdk_bdev_compress.so.6.0 00:03:14.508 SYMLINK libspdk_bdev_error.so 00:03:14.508 SO libspdk_bdev_virtio.so.6.0 00:03:14.768 SYMLINK libspdk_bdev_compress.so 00:03:14.768 SYMLINK libspdk_bdev_virtio.so 00:03:15.027 LIB libspdk_bdev_raid.a 00:03:15.027 SO libspdk_bdev_raid.so.6.0 00:03:15.286 SYMLINK libspdk_bdev_raid.so 00:03:16.225 LIB libspdk_bdev_nvme.a 00:03:16.225 SO libspdk_bdev_nvme.so.7.0 00:03:16.485 SYMLINK libspdk_bdev_nvme.so 00:03:17.053 CC module/event/subsystems/iobuf/iobuf.o 00:03:17.053 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:03:17.053 CC module/event/subsystems/vmd/vmd.o 00:03:17.053 CC module/event/subsystems/vmd/vmd_rpc.o 00:03:17.053 CC module/event/subsystems/sock/sock.o 00:03:17.053 CC module/event/subsystems/keyring/keyring.o 00:03:17.312 CC module/event/subsystems/scheduler/scheduler.o 00:03:17.312 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:03:17.312 LIB libspdk_event_sock.a 00:03:17.312 LIB libspdk_event_vmd.a 00:03:17.312 LIB libspdk_event_keyring.a 00:03:17.312 LIB libspdk_event_scheduler.a 00:03:17.312 LIB libspdk_event_vhost_blk.a 00:03:17.312 LIB libspdk_event_iobuf.a 00:03:17.312 SO libspdk_event_vmd.so.6.0 00:03:17.312 SO libspdk_event_sock.so.5.0 00:03:17.312 SO 
libspdk_event_scheduler.so.4.0 00:03:17.312 SO libspdk_event_keyring.so.1.0 00:03:17.312 SO libspdk_event_vhost_blk.so.3.0 00:03:17.312 SO libspdk_event_iobuf.so.3.0 00:03:17.571 SYMLINK libspdk_event_sock.so 00:03:17.571 SYMLINK libspdk_event_vmd.so 00:03:17.571 SYMLINK libspdk_event_keyring.so 00:03:17.571 SYMLINK libspdk_event_scheduler.so 00:03:17.571 SYMLINK libspdk_event_vhost_blk.so 00:03:17.571 SYMLINK libspdk_event_iobuf.so 00:03:17.831 CC module/event/subsystems/accel/accel.o 00:03:18.090 LIB libspdk_event_accel.a 00:03:18.090 SO libspdk_event_accel.so.6.0 00:03:18.090 SYMLINK libspdk_event_accel.so 00:03:18.660 CC module/event/subsystems/bdev/bdev.o 00:03:18.660 LIB libspdk_event_bdev.a 00:03:18.660 SO libspdk_event_bdev.so.6.0 00:03:18.919 SYMLINK libspdk_event_bdev.so 00:03:19.178 CC module/event/subsystems/scsi/scsi.o 00:03:19.178 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:03:19.178 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:03:19.178 CC module/event/subsystems/ublk/ublk.o 00:03:19.178 CC module/event/subsystems/nbd/nbd.o 00:03:19.437 LIB libspdk_event_nbd.a 00:03:19.437 LIB libspdk_event_ublk.a 00:03:19.437 LIB libspdk_event_scsi.a 00:03:19.437 SO libspdk_event_nbd.so.6.0 00:03:19.438 SO libspdk_event_ublk.so.3.0 00:03:19.438 SO libspdk_event_scsi.so.6.0 00:03:19.438 SYMLINK libspdk_event_ublk.so 00:03:19.438 SYMLINK libspdk_event_nbd.so 00:03:19.438 SYMLINK libspdk_event_scsi.so 00:03:19.697 LIB libspdk_event_nvmf.a 00:03:19.697 SO libspdk_event_nvmf.so.6.0 00:03:19.697 SYMLINK libspdk_event_nvmf.so 00:03:19.955 CC module/event/subsystems/iscsi/iscsi.o 00:03:19.955 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:19.955 LIB libspdk_event_vhost_scsi.a 00:03:20.214 LIB libspdk_event_iscsi.a 00:03:20.214 SO libspdk_event_vhost_scsi.so.3.0 00:03:20.214 SO libspdk_event_iscsi.so.6.0 00:03:20.214 SYMLINK libspdk_event_vhost_scsi.so 00:03:20.214 SYMLINK libspdk_event_iscsi.so 00:03:20.474 SO libspdk.so.6.0 00:03:20.474 SYMLINK libspdk.so 
00:03:20.733 CC app/trace_record/trace_record.o 00:03:20.733 CXX app/trace/trace.o 00:03:20.733 CC app/spdk_lspci/spdk_lspci.o 00:03:20.733 CC app/spdk_nvme_perf/perf.o 00:03:20.733 CC app/spdk_nvme_identify/identify.o 00:03:20.733 CC app/spdk_nvme_discover/discovery_aer.o 00:03:20.733 TEST_HEADER include/spdk/accel.h 00:03:20.733 CC app/spdk_top/spdk_top.o 00:03:20.733 TEST_HEADER include/spdk/accel_module.h 00:03:20.733 TEST_HEADER include/spdk/base64.h 00:03:20.733 TEST_HEADER include/spdk/assert.h 00:03:20.733 TEST_HEADER include/spdk/barrier.h 00:03:20.733 TEST_HEADER include/spdk/bdev.h 00:03:20.733 TEST_HEADER include/spdk/bdev_zone.h 00:03:20.733 CC test/rpc_client/rpc_client_test.o 00:03:20.733 TEST_HEADER include/spdk/bit_array.h 00:03:20.733 TEST_HEADER include/spdk/bdev_module.h 00:03:20.733 TEST_HEADER include/spdk/bit_pool.h 00:03:20.733 TEST_HEADER include/spdk/blob_bdev.h 00:03:20.733 TEST_HEADER include/spdk/blobfs.h 00:03:20.733 TEST_HEADER include/spdk/blob.h 00:03:20.733 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:20.733 TEST_HEADER include/spdk/conf.h 00:03:20.733 TEST_HEADER include/spdk/config.h 00:03:20.733 TEST_HEADER include/spdk/cpuset.h 00:03:20.733 TEST_HEADER include/spdk/crc16.h 00:03:20.733 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:20.733 TEST_HEADER include/spdk/crc32.h 00:03:20.733 TEST_HEADER include/spdk/crc64.h 00:03:20.733 TEST_HEADER include/spdk/dif.h 00:03:20.733 TEST_HEADER include/spdk/dma.h 00:03:20.733 TEST_HEADER include/spdk/endian.h 00:03:20.733 TEST_HEADER include/spdk/env_dpdk.h 00:03:20.733 TEST_HEADER include/spdk/env.h 00:03:20.733 TEST_HEADER include/spdk/event.h 00:03:20.733 TEST_HEADER include/spdk/fd.h 00:03:20.733 TEST_HEADER include/spdk/file.h 00:03:20.733 TEST_HEADER include/spdk/fd_group.h 00:03:20.733 TEST_HEADER include/spdk/ftl.h 00:03:20.733 TEST_HEADER include/spdk/gpt_spec.h 00:03:20.733 TEST_HEADER include/spdk/idxd.h 00:03:20.733 TEST_HEADER include/spdk/histogram_data.h 00:03:20.733 
TEST_HEADER include/spdk/hexlify.h 00:03:20.733 TEST_HEADER include/spdk/idxd_spec.h 00:03:20.733 TEST_HEADER include/spdk/init.h 00:03:20.733 TEST_HEADER include/spdk/ioat.h 00:03:20.733 TEST_HEADER include/spdk/ioat_spec.h 00:03:20.733 TEST_HEADER include/spdk/iscsi_spec.h 00:03:20.733 TEST_HEADER include/spdk/json.h 00:03:20.733 TEST_HEADER include/spdk/jsonrpc.h 00:03:20.733 TEST_HEADER include/spdk/keyring_module.h 00:03:20.733 TEST_HEADER include/spdk/keyring.h 00:03:20.733 TEST_HEADER include/spdk/likely.h 00:03:20.733 TEST_HEADER include/spdk/lvol.h 00:03:20.733 TEST_HEADER include/spdk/log.h 00:03:20.733 CC app/iscsi_tgt/iscsi_tgt.o 00:03:20.733 TEST_HEADER include/spdk/memory.h 00:03:21.007 TEST_HEADER include/spdk/mmio.h 00:03:21.007 TEST_HEADER include/spdk/notify.h 00:03:21.007 TEST_HEADER include/spdk/nbd.h 00:03:21.007 TEST_HEADER include/spdk/nvme_intel.h 00:03:21.007 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:21.007 TEST_HEADER include/spdk/nvme.h 00:03:21.007 TEST_HEADER include/spdk/nvme_spec.h 00:03:21.007 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:21.007 TEST_HEADER include/spdk/nvme_zns.h 00:03:21.007 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:21.007 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:21.007 TEST_HEADER include/spdk/nvmf.h 00:03:21.007 TEST_HEADER include/spdk/nvmf_spec.h 00:03:21.007 TEST_HEADER include/spdk/nvmf_transport.h 00:03:21.007 TEST_HEADER include/spdk/opal.h 00:03:21.007 TEST_HEADER include/spdk/opal_spec.h 00:03:21.007 TEST_HEADER include/spdk/pci_ids.h 00:03:21.007 TEST_HEADER include/spdk/pipe.h 00:03:21.007 TEST_HEADER include/spdk/queue.h 00:03:21.007 TEST_HEADER include/spdk/reduce.h 00:03:21.007 TEST_HEADER include/spdk/rpc.h 00:03:21.007 TEST_HEADER include/spdk/scheduler.h 00:03:21.007 TEST_HEADER include/spdk/scsi.h 00:03:21.007 TEST_HEADER include/spdk/scsi_spec.h 00:03:21.007 TEST_HEADER include/spdk/sock.h 00:03:21.007 TEST_HEADER include/spdk/stdinc.h 00:03:21.007 TEST_HEADER include/spdk/string.h 
00:03:21.007 TEST_HEADER include/spdk/thread.h 00:03:21.007 TEST_HEADER include/spdk/trace.h 00:03:21.007 TEST_HEADER include/spdk/trace_parser.h 00:03:21.007 TEST_HEADER include/spdk/tree.h 00:03:21.007 TEST_HEADER include/spdk/util.h 00:03:21.007 TEST_HEADER include/spdk/ublk.h 00:03:21.007 TEST_HEADER include/spdk/uuid.h 00:03:21.007 TEST_HEADER include/spdk/version.h 00:03:21.007 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:21.007 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:21.007 TEST_HEADER include/spdk/vhost.h 00:03:21.007 TEST_HEADER include/spdk/vmd.h 00:03:21.007 TEST_HEADER include/spdk/xor.h 00:03:21.007 TEST_HEADER include/spdk/zipf.h 00:03:21.007 CXX test/cpp_headers/accel.o 00:03:21.007 CXX test/cpp_headers/accel_module.o 00:03:21.007 CXX test/cpp_headers/barrier.o 00:03:21.007 CXX test/cpp_headers/assert.o 00:03:21.007 CXX test/cpp_headers/base64.o 00:03:21.007 CXX test/cpp_headers/bdev.o 00:03:21.007 CXX test/cpp_headers/bdev_module.o 00:03:21.007 CXX test/cpp_headers/bit_array.o 00:03:21.007 CXX test/cpp_headers/bit_pool.o 00:03:21.007 CXX test/cpp_headers/bdev_zone.o 00:03:21.007 CC app/spdk_tgt/spdk_tgt.o 00:03:21.007 CXX test/cpp_headers/blob_bdev.o 00:03:21.007 CXX test/cpp_headers/blobfs_bdev.o 00:03:21.007 CXX test/cpp_headers/blobfs.o 00:03:21.007 CXX test/cpp_headers/conf.o 00:03:21.007 CXX test/cpp_headers/config.o 00:03:21.007 CXX test/cpp_headers/blob.o 00:03:21.007 CC app/spdk_dd/spdk_dd.o 00:03:21.007 CXX test/cpp_headers/cpuset.o 00:03:21.007 CXX test/cpp_headers/crc32.o 00:03:21.007 CXX test/cpp_headers/crc64.o 00:03:21.007 CXX test/cpp_headers/crc16.o 00:03:21.007 CXX test/cpp_headers/dif.o 00:03:21.007 CXX test/cpp_headers/dma.o 00:03:21.007 CXX test/cpp_headers/endian.o 00:03:21.007 CXX test/cpp_headers/env_dpdk.o 00:03:21.007 CXX test/cpp_headers/env.o 00:03:21.007 CXX test/cpp_headers/fd_group.o 00:03:21.007 CXX test/cpp_headers/event.o 00:03:21.007 CXX test/cpp_headers/fd.o 00:03:21.007 CXX test/cpp_headers/file.o 
00:03:21.007 CXX test/cpp_headers/gpt_spec.o 00:03:21.007 CXX test/cpp_headers/ftl.o 00:03:21.007 CXX test/cpp_headers/hexlify.o 00:03:21.007 CC examples/ioat/verify/verify.o 00:03:21.007 CXX test/cpp_headers/histogram_data.o 00:03:21.007 CXX test/cpp_headers/idxd_spec.o 00:03:21.007 CXX test/cpp_headers/idxd.o 00:03:21.007 CXX test/cpp_headers/init.o 00:03:21.007 CXX test/cpp_headers/ioat_spec.o 00:03:21.007 CXX test/cpp_headers/ioat.o 00:03:21.007 CXX test/cpp_headers/iscsi_spec.o 00:03:21.007 CXX test/cpp_headers/json.o 00:03:21.007 CXX test/cpp_headers/jsonrpc.o 00:03:21.007 CC examples/ioat/perf/perf.o 00:03:21.007 CXX test/cpp_headers/keyring.o 00:03:21.007 CC app/nvmf_tgt/nvmf_main.o 00:03:21.007 CC examples/util/zipf/zipf.o 00:03:21.007 CXX test/cpp_headers/keyring_module.o 00:03:21.007 CC test/env/vtophys/vtophys.o 00:03:21.007 CC test/env/pci/pci_ut.o 00:03:21.007 CC test/thread/poller_perf/poller_perf.o 00:03:21.007 CC test/app/histogram_perf/histogram_perf.o 00:03:21.007 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:21.007 CC test/env/memory/memory_ut.o 00:03:21.007 CC test/app/stub/stub.o 00:03:21.007 CC app/fio/nvme/fio_plugin.o 00:03:21.007 CC test/app/jsoncat/jsoncat.o 00:03:21.007 LINK spdk_lspci 00:03:21.007 CC app/fio/bdev/fio_plugin.o 00:03:21.007 CC test/dma/test_dma/test_dma.o 00:03:21.007 CC test/app/bdev_svc/bdev_svc.o 00:03:21.268 LINK spdk_nvme_discover 00:03:21.268 CC test/env/mem_callbacks/mem_callbacks.o 00:03:21.268 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:21.268 LINK histogram_perf 00:03:21.268 LINK rpc_client_test 00:03:21.268 LINK vtophys 00:03:21.268 LINK poller_perf 00:03:21.533 LINK env_dpdk_post_init 00:03:21.533 LINK interrupt_tgt 00:03:21.533 CXX test/cpp_headers/likely.o 00:03:21.533 LINK spdk_trace_record 00:03:21.533 CXX test/cpp_headers/log.o 00:03:21.533 LINK spdk_tgt 00:03:21.533 CXX test/cpp_headers/lvol.o 00:03:21.533 LINK jsoncat 00:03:21.533 CXX test/cpp_headers/memory.o 00:03:21.533 LINK verify 
00:03:21.533 CXX test/cpp_headers/nbd.o 00:03:21.533 CXX test/cpp_headers/mmio.o 00:03:21.533 LINK stub 00:03:21.533 LINK nvmf_tgt 00:03:21.533 CXX test/cpp_headers/notify.o 00:03:21.533 CXX test/cpp_headers/nvme.o 00:03:21.533 CXX test/cpp_headers/nvme_intel.o 00:03:21.533 CXX test/cpp_headers/nvme_ocssd.o 00:03:21.533 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:21.533 CXX test/cpp_headers/nvme_spec.o 00:03:21.533 CXX test/cpp_headers/nvme_zns.o 00:03:21.533 CXX test/cpp_headers/nvmf_cmd.o 00:03:21.533 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:21.533 CXX test/cpp_headers/nvmf.o 00:03:21.533 CXX test/cpp_headers/nvmf_spec.o 00:03:21.533 CXX test/cpp_headers/nvmf_transport.o 00:03:21.533 CXX test/cpp_headers/opal.o 00:03:21.533 LINK iscsi_tgt 00:03:21.533 CXX test/cpp_headers/opal_spec.o 00:03:21.533 CXX test/cpp_headers/pci_ids.o 00:03:21.533 CXX test/cpp_headers/pipe.o 00:03:21.533 CXX test/cpp_headers/queue.o 00:03:21.533 CXX test/cpp_headers/reduce.o 00:03:21.533 CXX test/cpp_headers/rpc.o 00:03:21.533 CXX test/cpp_headers/scheduler.o 00:03:21.533 CXX test/cpp_headers/scsi.o 00:03:21.533 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:21.533 CXX test/cpp_headers/sock.o 00:03:21.533 CXX test/cpp_headers/scsi_spec.o 00:03:21.533 LINK ioat_perf 00:03:21.792 CXX test/cpp_headers/stdinc.o 00:03:21.792 CXX test/cpp_headers/string.o 00:03:21.792 CXX test/cpp_headers/thread.o 00:03:21.792 CXX test/cpp_headers/trace.o 00:03:21.792 CXX test/cpp_headers/tree.o 00:03:21.792 CXX test/cpp_headers/trace_parser.o 00:03:21.792 CXX test/cpp_headers/util.o 00:03:21.792 CXX test/cpp_headers/ublk.o 00:03:21.792 CXX test/cpp_headers/uuid.o 00:03:21.792 CXX test/cpp_headers/version.o 00:03:21.792 CXX test/cpp_headers/vfio_user_pci.o 00:03:21.792 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:21.792 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:21.792 LINK zipf 00:03:21.792 CXX test/cpp_headers/vfio_user_spec.o 00:03:21.792 CXX test/cpp_headers/vhost.o 00:03:21.792 CXX 
test/cpp_headers/vmd.o 00:03:21.792 CXX test/cpp_headers/xor.o 00:03:21.792 CXX test/cpp_headers/zipf.o 00:03:21.792 LINK spdk_trace 00:03:21.792 LINK spdk_dd 00:03:21.792 LINK pci_ut 00:03:21.792 LINK bdev_svc 00:03:22.050 CC test/event/reactor_perf/reactor_perf.o 00:03:22.051 CC test/event/event_perf/event_perf.o 00:03:22.051 CC test/event/reactor/reactor.o 00:03:22.051 CC test/event/app_repeat/app_repeat.o 00:03:22.051 LINK test_dma 00:03:22.310 LINK spdk_nvme_perf 00:03:22.310 LINK nvme_fuzz 00:03:22.310 CC test/event/scheduler/scheduler.o 00:03:22.310 LINK spdk_nvme_identify 00:03:22.310 LINK spdk_bdev 00:03:22.310 LINK spdk_nvme 00:03:22.310 LINK mem_callbacks 00:03:22.310 LINK reactor_perf 00:03:22.310 LINK reactor 00:03:22.310 LINK event_perf 00:03:22.310 LINK app_repeat 00:03:22.310 CC examples/vmd/lsvmd/lsvmd.o 00:03:22.310 CC examples/idxd/perf/perf.o 00:03:22.310 CC examples/sock/hello_world/hello_sock.o 00:03:22.310 CC examples/vmd/led/led.o 00:03:22.310 CC app/vhost/vhost.o 00:03:22.310 LINK vhost_fuzz 00:03:22.310 CC examples/thread/thread/thread_ex.o 00:03:22.568 LINK lsvmd 00:03:22.568 LINK led 00:03:22.568 LINK vhost 00:03:22.568 LINK scheduler 00:03:22.568 LINK hello_sock 00:03:22.568 CC test/nvme/boot_partition/boot_partition.o 00:03:22.568 CC test/nvme/overhead/overhead.o 00:03:22.568 CC test/nvme/startup/startup.o 00:03:22.568 CC test/nvme/reset/reset.o 00:03:22.568 CC test/nvme/cuse/cuse.o 00:03:22.568 CC test/nvme/compliance/nvme_compliance.o 00:03:22.568 CC test/nvme/sgl/sgl.o 00:03:22.568 CC test/nvme/fused_ordering/fused_ordering.o 00:03:22.568 CC test/nvme/simple_copy/simple_copy.o 00:03:22.568 CC test/nvme/connect_stress/connect_stress.o 00:03:22.568 CC test/nvme/aer/aer.o 00:03:22.568 CC test/nvme/err_injection/err_injection.o 00:03:22.568 CC test/nvme/e2edp/nvme_dp.o 00:03:22.568 CC test/nvme/fdp/fdp.o 00:03:22.827 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:22.827 CC test/nvme/reserve/reserve.o 00:03:22.827 CC 
test/blobfs/mkfs/mkfs.o 00:03:22.827 CC test/accel/dif/dif.o 00:03:22.827 LINK thread 00:03:22.827 LINK idxd_perf 00:03:22.827 CC test/lvol/esnap/esnap.o 00:03:22.827 LINK memory_ut 00:03:22.827 LINK boot_partition 00:03:22.827 LINK err_injection 00:03:22.827 LINK connect_stress 00:03:22.827 LINK startup 00:03:22.827 LINK doorbell_aers 00:03:22.827 LINK fused_ordering 00:03:22.827 LINK reserve 00:03:23.086 LINK simple_copy 00:03:23.086 LINK spdk_top 00:03:23.086 LINK reset 00:03:23.086 LINK mkfs 00:03:23.086 LINK sgl 00:03:23.086 LINK nvme_dp 00:03:23.086 LINK aer 00:03:23.086 LINK nvme_compliance 00:03:23.086 LINK fdp 00:03:23.086 CC examples/nvme/hello_world/hello_world.o 00:03:23.086 CC examples/nvme/abort/abort.o 00:03:23.086 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:23.086 CC examples/nvme/reconnect/reconnect.o 00:03:23.086 CC examples/nvme/arbitration/arbitration.o 00:03:23.086 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:23.086 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:23.086 CC examples/nvme/hotplug/hotplug.o 00:03:23.345 CC examples/accel/perf/accel_perf.o 00:03:23.345 LINK iscsi_fuzz 00:03:23.345 LINK pmr_persistence 00:03:23.345 CC examples/blob/cli/blobcli.o 00:03:23.345 CC examples/blob/hello_world/hello_blob.o 00:03:23.345 LINK overhead 00:03:23.345 LINK cmb_copy 00:03:23.345 LINK hello_world 00:03:23.345 LINK hotplug 00:03:23.604 LINK arbitration 00:03:23.604 LINK reconnect 00:03:23.604 LINK abort 00:03:23.604 LINK nvme_manage 00:03:23.604 LINK hello_blob 00:03:23.604 LINK dif 00:03:23.863 LINK cuse 00:03:23.863 LINK accel_perf 00:03:23.863 LINK blobcli 00:03:24.432 CC examples/bdev/hello_world/hello_bdev.o 00:03:24.432 CC examples/bdev/bdevperf/bdevperf.o 00:03:24.691 CC test/bdev/bdevio/bdevio.o 00:03:24.691 LINK hello_bdev 00:03:24.951 LINK bdevio 00:03:25.210 LINK bdevperf 00:03:26.146 CC examples/nvmf/nvmf/nvmf.o 00:03:26.146 LINK nvmf 00:03:28.045 LINK esnap 00:03:28.303 00:03:28.303 real 1m36.374s 00:03:28.303 user 
18m2.761s 00:03:28.303 sys 4m28.791s 00:03:28.303 11:44:41 make -- common/autotest_common.sh@1124 -- $ xtrace_disable 00:03:28.303 11:44:41 make -- common/autotest_common.sh@10 -- $ set +x 00:03:28.303 ************************************ 00:03:28.303 END TEST make 00:03:28.303 ************************************ 00:03:28.303 11:44:41 -- common/autotest_common.sh@1142 -- $ return 0 00:03:28.303 11:44:41 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:28.303 11:44:41 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:28.303 11:44:41 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:28.303 11:44:41 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.303 11:44:41 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:28.303 11:44:41 -- pm/common@44 -- $ pid=1291190 00:03:28.303 11:44:41 -- pm/common@50 -- $ kill -TERM 1291190 00:03:28.303 11:44:41 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.303 11:44:41 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:28.303 11:44:41 -- pm/common@44 -- $ pid=1291192 00:03:28.303 11:44:41 -- pm/common@50 -- $ kill -TERM 1291192 00:03:28.303 11:44:41 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.303 11:44:41 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:28.303 11:44:41 -- pm/common@44 -- $ pid=1291194 00:03:28.303 11:44:41 -- pm/common@50 -- $ kill -TERM 1291194 00:03:28.303 11:44:41 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.303 11:44:41 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:28.303 11:44:41 -- pm/common@44 -- $ pid=1291216 00:03:28.303 11:44:41 -- pm/common@50 -- $ sudo -E kill -TERM 1291216 00:03:28.562 11:44:41 -- 
spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:28.562 11:44:41 -- nvmf/common.sh@7 -- # uname -s 00:03:28.562 11:44:41 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:28.562 11:44:41 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:28.562 11:44:41 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:28.562 11:44:41 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:03:28.562 11:44:41 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:28.562 11:44:41 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:28.562 11:44:41 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:28.562 11:44:41 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:28.562 11:44:41 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:28.562 11:44:41 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:28.562 11:44:41 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:03:28.562 11:44:41 -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:03:28.562 11:44:41 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:28.562 11:44:41 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:28.562 11:44:41 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:28.562 11:44:41 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:28.562 11:44:41 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:28.562 11:44:41 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:28.562 11:44:41 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:28.562 11:44:41 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:28.562 11:44:41 -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:28.562 11:44:41 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:28.562 11:44:41 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:28.562 11:44:41 -- paths/export.sh@5 -- # export PATH 00:03:28.562 11:44:41 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:28.562 11:44:41 -- nvmf/common.sh@47 -- # : 0 00:03:28.562 11:44:41 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:28.562 11:44:41 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:28.562 11:44:41 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:28.562 11:44:41 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:28.562 11:44:41 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:28.562 11:44:41 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:28.562 11:44:41 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:28.562 11:44:41 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:28.562 11:44:41 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:28.562 11:44:41 -- spdk/autotest.sh@32 -- # 
uname -s 00:03:28.562 11:44:41 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:28.562 11:44:41 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:28.562 11:44:41 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:28.562 11:44:41 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:28.562 11:44:41 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:28.562 11:44:41 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:28.562 11:44:42 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:28.562 11:44:42 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:28.562 11:44:42 -- spdk/autotest.sh@48 -- # udevadm_pid=1358262 00:03:28.562 11:44:42 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:28.562 11:44:42 -- pm/common@17 -- # local monitor 00:03:28.562 11:44:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.562 11:44:42 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:28.562 11:44:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.562 11:44:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.562 11:44:42 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:28.562 11:44:42 -- pm/common@21 -- # date +%s 00:03:28.562 11:44:42 -- pm/common@25 -- # sleep 1 00:03:28.562 11:44:42 -- pm/common@21 -- # date +%s 00:03:28.562 11:44:42 -- pm/common@21 -- # date +%s 00:03:28.562 11:44:42 -- pm/common@21 -- # date +%s 00:03:28.562 11:44:42 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721036682 00:03:28.562 11:44:42 -- pm/common@21 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721036682 00:03:28.562 11:44:42 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721036682 00:03:28.562 11:44:42 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721036682 00:03:28.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721036682_collect-vmstat.pm.log 00:03:28.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721036682_collect-cpu-load.pm.log 00:03:28.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721036682_collect-cpu-temp.pm.log 00:03:28.562 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721036682_collect-bmc-pm.bmc.pm.log 00:03:29.498 11:44:43 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:29.498 11:44:43 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:29.498 11:44:43 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:29.498 11:44:43 -- common/autotest_common.sh@10 -- # set +x 00:03:29.498 11:44:43 -- spdk/autotest.sh@59 -- # create_test_list 00:03:29.498 11:44:43 -- common/autotest_common.sh@746 -- # xtrace_disable 00:03:29.498 11:44:43 -- common/autotest_common.sh@10 -- # set +x 00:03:29.498 11:44:43 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:29.498 11:44:43 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:03:29.498 11:44:43 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:29.498 11:44:43 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:29.498 11:44:43 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:29.498 11:44:43 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:29.498 11:44:43 -- common/autotest_common.sh@1455 -- # uname 00:03:29.757 11:44:43 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:29.757 11:44:43 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:29.757 11:44:43 -- common/autotest_common.sh@1475 -- # uname 00:03:29.757 11:44:43 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:29.757 11:44:43 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:29.757 11:44:43 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:29.757 11:44:43 -- spdk/autotest.sh@72 -- # hash lcov 00:03:29.757 11:44:43 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:29.757 11:44:43 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:29.757 --rc lcov_branch_coverage=1 00:03:29.757 --rc lcov_function_coverage=1 00:03:29.757 --rc genhtml_branch_coverage=1 00:03:29.757 --rc genhtml_function_coverage=1 00:03:29.757 --rc genhtml_legend=1 00:03:29.757 --rc geninfo_all_blocks=1 00:03:29.757 ' 00:03:29.757 11:44:43 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:29.757 --rc lcov_branch_coverage=1 00:03:29.757 --rc lcov_function_coverage=1 00:03:29.757 --rc genhtml_branch_coverage=1 00:03:29.757 --rc genhtml_function_coverage=1 00:03:29.757 --rc genhtml_legend=1 00:03:29.757 --rc geninfo_all_blocks=1 00:03:29.757 ' 00:03:29.757 11:44:43 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:29.757 --rc lcov_branch_coverage=1 00:03:29.757 --rc lcov_function_coverage=1 00:03:29.757 --rc genhtml_branch_coverage=1 00:03:29.757 --rc genhtml_function_coverage=1 00:03:29.757 --rc genhtml_legend=1 
00:03:29.757 --rc geninfo_all_blocks=1 00:03:29.757 --no-external' 00:03:29.757 11:44:43 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:29.757 --rc lcov_branch_coverage=1 00:03:29.757 --rc lcov_function_coverage=1 00:03:29.757 --rc genhtml_branch_coverage=1 00:03:29.757 --rc genhtml_function_coverage=1 00:03:29.757 --rc genhtml_legend=1 00:03:29.757 --rc geninfo_all_blocks=1 00:03:29.757 --no-external' 00:03:29.757 11:44:43 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:29.757 lcov: LCOV version 1.14 00:03:29.757 11:44:43 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:41.967 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:41.967 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:03:51.975 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:03:51.975 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:03:51.975 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:03:51.975 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:03:51.975 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data 
for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:03:51.976 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:03:51.976 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:03:51.976 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 
00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:03:51.976 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:03:51.976 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:03:51.976 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 
00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:03:51.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:03:51.977 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:03:52.274 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:52.274 geninfo: 
WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:52.274 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:52.274 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:52.274 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:52.275 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:52.275 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:52.275 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:52.275 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:52.275 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:52.275 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:52.275 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:52.275 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:55.572 11:45:08 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:55.572 11:45:08 -- common/autotest_common.sh@722 -- # xtrace_disable 00:03:55.572 11:45:08 -- common/autotest_common.sh@10 -- # set +x 00:03:55.572 11:45:08 -- spdk/autotest.sh@91 -- # rm -f 00:03:55.572 11:45:08 -- spdk/autotest.sh@94 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:58.861 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:03:58.861 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.3 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:58.861 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:59.119 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:59.119 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:59.119 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:59.119 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:59.119 11:45:12 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:59.119 11:45:12 -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:03:59.119 11:45:12 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:03:59.120 11:45:12 -- common/autotest_common.sh@1670 -- # local nvme bdf 00:03:59.120 11:45:12 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:03:59.120 11:45:12 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:03:59.120 11:45:12 -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:03:59.120 11:45:12 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:59.120 11:45:12 -- 
common/autotest_common.sh@1665 -- # [[ none != none ]] 00:03:59.120 11:45:12 -- spdk/autotest.sh@98 -- # (( 0 > 0 )) 00:03:59.120 11:45:12 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:59.120 11:45:12 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:59.120 11:45:12 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:59.120 11:45:12 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:59.120 11:45:12 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:59.120 No valid GPT data, bailing 00:03:59.120 11:45:12 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:59.378 11:45:12 -- scripts/common.sh@391 -- # pt= 00:03:59.378 11:45:12 -- scripts/common.sh@392 -- # return 1 00:03:59.378 11:45:12 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:59.378 1+0 records in 00:03:59.378 1+0 records out 00:03:59.378 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00643695 s, 163 MB/s 00:03:59.378 11:45:12 -- spdk/autotest.sh@118 -- # sync 00:03:59.378 11:45:12 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:59.378 11:45:12 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:59.378 11:45:12 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:04:04.650 11:45:17 -- spdk/autotest.sh@124 -- # uname -s 00:04:04.650 11:45:17 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:04:04.650 11:45:17 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:04.650 11:45:17 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.650 11:45:17 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.650 11:45:17 -- common/autotest_common.sh@10 -- # set +x 00:04:04.650 ************************************ 00:04:04.650 START TEST setup.sh 00:04:04.650 ************************************ 00:04:04.650 
11:45:17 setup.sh -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:04:04.650 * Looking for test storage... 00:04:04.650 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:04.650 11:45:17 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:04:04.650 11:45:17 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:04:04.650 11:45:17 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:04.650 11:45:17 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:04.650 11:45:17 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:04.650 11:45:17 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:04.650 ************************************ 00:04:04.650 START TEST acl 00:04:04.650 ************************************ 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:04:04.650 * Looking for test storage... 
00:04:04.650 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:04.650 11:45:18 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:04:04.650 11:45:18 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:04:04.650 11:45:18 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:04.650 11:45:18 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:08.839 11:45:22 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:04:08.839 11:45:22 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:04:08.839 11:45:22 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:08.839 11:45:22 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:04:08.839 11:45:22 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:04:08.839 11:45:22 setup.sh.acl -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:04:13.033 Hugepages 00:04:13.033 node hugesize free / total 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 00:04:13.033 Type BDF Vendor Device NUMA Driver Device Block devices 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:25 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme 
]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- 
setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 
0000:80:04.5 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:04:13.033 11:45:26 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:04:13.033 11:45:26 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:13.033 11:45:26 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:13.033 11:45:26 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:13.033 ************************************ 00:04:13.033 START TEST denied 00:04:13.033 ************************************ 00:04:13.034 11:45:26 setup.sh.acl.denied -- common/autotest_common.sh@1123 -- # denied 00:04:13.034 11:45:26 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:5e:00.0' 00:04:13.034 11:45:26 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:04:13.034 11:45:26 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:04:13.034 11:45:26 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:04:13.034 11:45:26 setup.sh.acl.denied -- setup/common.sh@10 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:17.224 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:17.224 11:45:30 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:22.501 00:04:22.501 real 0m9.118s 00:04:22.501 user 0m2.929s 00:04:22.501 sys 0m5.493s 00:04:22.501 11:45:35 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:22.501 11:45:35 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:04:22.501 ************************************ 00:04:22.501 END TEST denied 00:04:22.501 ************************************ 00:04:22.501 11:45:35 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:22.501 11:45:35 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:04:22.501 11:45:35 setup.sh.acl -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:22.501 11:45:35 setup.sh.acl -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:22.501 11:45:35 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:04:22.501 
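The `denied` test above sets `PCI_BLOCKED` and then verifies the controller is still bound to the kernel `nvme` driver by resolving the device's `driver` symlink in sysfs. A condensed sketch of that verification step (the function name `bound_driver` is illustrative, not part of SPDK; the BDF is the one from the log):

```shell
#!/usr/bin/env bash
# Sketch of the driver check used by the "denied" test: resolve the PCI
# device's driver symlink and report its basename (e.g. "nvme", "vfio-pci").
bound_driver() {
    local bdf=$1 link
    # readlink -f fails if the device node or driver link does not exist
    link=$(readlink -f "/sys/bus/pci/devices/$bdf/driver") || return 1
    printf '%s\n' "${link##*/}"
}

# BDF taken from the log; prints "nvme" when the kernel NVMe driver is
# bound, or falls through when the device is absent on this host.
bound_driver 0000:5e:00.0 || echo "no such device on this host"
```

This mirrors the log's `readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver` followed by `[[ nvme == \n\v\m\e ]]`: a blocked device should remain on its original driver rather than being rebound by `setup.sh`.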
************************************ 00:04:22.501 START TEST allowed 00:04:22.501 ************************************ 00:04:22.501 11:45:35 setup.sh.acl.allowed -- common/autotest_common.sh@1123 -- # allowed 00:04:22.501 11:45:35 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:04:22.501 11:45:35 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:04:22.501 11:45:35 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:04:22.501 11:45:35 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.502 11:45:35 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:32.484 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:32.484 11:45:44 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:04:32.484 11:45:44 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:04:32.484 11:45:44 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:04:32.484 11:45:44 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:32.484 11:45:44 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:35.020 00:04:35.020 real 0m13.045s 00:04:35.020 user 0m2.839s 00:04:35.020 sys 0m5.409s 00:04:35.020 11:45:48 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:35.020 11:45:48 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:04:35.020 ************************************ 00:04:35.020 END TEST allowed 00:04:35.020 ************************************ 00:04:35.020 11:45:48 setup.sh.acl -- common/autotest_common.sh@1142 -- # return 0 00:04:35.020 00:04:35.020 real 0m30.525s 00:04:35.020 user 0m8.691s 00:04:35.020 sys 0m16.660s 00:04:35.020 11:45:48 setup.sh.acl -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:35.020 11:45:48 setup.sh.acl -- common/autotest_common.sh@10 -- # 
set +x 00:04:35.020 ************************************ 00:04:35.020 END TEST acl 00:04:35.020 ************************************ 00:04:35.020 11:45:48 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:04:35.020 11:45:48 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:35.020 11:45:48 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:35.021 11:45:48 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:35.021 11:45:48 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:35.281 ************************************ 00:04:35.281 START TEST hugepages 00:04:35.281 ************************************ 00:04:35.281 11:45:48 setup.sh.hugepages -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:04:35.281 * Looking for test storage... 00:04:35.281 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@22 
-- # mem_f=/proc/meminfo 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 72666180 kB' 'MemAvailable: 76024744 kB' 'Buffers: 12156 kB' 'Cached: 13577016 kB' 'SwapCached: 0 kB' 'Active: 10626964 kB' 'Inactive: 3473704 kB' 'Active(anon): 10189776 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514816 kB' 'Mapped: 167428 kB' 'Shmem: 9678280 kB' 'KReclaimable: 202184 kB' 'Slab: 507812 kB' 'SReclaimable: 202184 kB' 'SUnreclaim: 305628 kB' 'KernelStack: 16192 kB' 'PageTables: 7816 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52438204 kB' 'Committed_AS: 11569316 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200868 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal 
== \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.281 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 
00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # 
[[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 
setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages 
-- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 
setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@32 -- # 
no_nodes=2 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:04:35.282 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:35.283 11:45:48 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:35.283 11:45:48 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:35.283 11:45:48 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:35.283 11:45:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:35.283 ************************************ 00:04:35.283 START TEST default_setup 
00:04:35.283 ************************************ 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1123 -- # default_setup 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup 
-- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:35.283 11:45:48 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:39.474 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:04:39.474 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:04:44.866 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 
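The trace above loops `IFS=': '` / `read -r var val _` over a meminfo file, skipping every key with `continue` until the requested one matches, then echoes its value (e.g. `echo 2048` for `Hugepagesize`). A minimal sketch of that `get_meminfo` pattern follows; the optional file argument is an illustrative addition for testing, while the real `setup/common.sh` helper reads `/proc/meminfo` or a per-node sysfs meminfo file:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo loop seen in setup/common.sh: split each
# "Key: value kB" line on ': ' and stop at the requested key.
# The file argument is hypothetical, added here so the sketch is testable;
# the real helper targets /proc/meminfo or /sys/devices/system/node/node*/meminfo.
get_meminfo() {
    local key=$1 file=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$key" ]]; then
            echo "$val"
            return 0
        fi
    done <"$file"
    return 1   # key not present
}
```

With `Hugepagesize` this yields the default huge page size in kB (2048 in the run above), which `hugepages.sh` then stores as `default_hugepages`.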
00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74819508 kB' 'MemAvailable: 78178024 kB' 'Buffers: 12156 kB' 'Cached: 13577156 kB' 'SwapCached: 0 kB' 'Active: 10645156 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207968 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 
8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532852 kB' 'Mapped: 167428 kB' 'Shmem: 9678420 kB' 'KReclaimable: 202088 kB' 'Slab: 506556 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304468 kB' 'KernelStack: 16256 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11587768 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200932 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup 
-- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.866 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 
11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
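Earlier in the trace, `clear_hp` resets the per-node pools before the test by writing 0 to each node's `hugepages-*/nr_hugepages` file and exporting `CLEAR_HUGE=yes`. A sketch of that step is below; the `root` parameter is an illustrative addition so the loop can be exercised against a fake directory tree, whereas the real script walks `/sys/devices/system/node` directly (and needs root to write there):

```shell
#!/usr/bin/env bash
# Sketch of the clear_hp step from setup/hugepages.sh: zero every
# per-node, per-size hugepage pool so each test starts clean.
# The root parameter is hypothetical, for testing against a fake sysfs tree.
clear_hp() {
    local root=${1:-/sys/devices/system/node} node hp
    for node in "$root"/node[0-9]*; do
        for hp in "$node"/hugepages/hugepages-*; do
            # Skip unmatched globs / read-only files rather than failing.
            [[ -w $hp/nr_hugepages ]] && echo 0 >"$hp/nr_hugepages"
        done
    done
}
```

After this reset, `default_setup` re-requests `nr_hugepages=1024` (2097152 kB / 2048 kB per page) on node 0 only, which matches the `HugePages_Total: 1024` seen in the meminfo dump that follows.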
00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@17 
-- # local get=HugePages_Surp 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74819076 kB' 'MemAvailable: 78177592 kB' 'Buffers: 12156 kB' 'Cached: 13577156 kB' 'SwapCached: 0 kB' 'Active: 10645308 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208120 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533080 kB' 'Mapped: 167308 kB' 'Shmem: 9678420 kB' 'KReclaimable: 202088 kB' 'Slab: 506540 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304452 kB' 'KernelStack: 16432 kB' 'PageTables: 8392 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11589272 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 
'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.867 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.868 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.868 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[
Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0 00:04:44.869 
11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74817076 kB' 'MemAvailable: 78175592 kB' 'Buffers: 12156 kB' 'Cached: 13577172 kB' 'SwapCached: 0 kB' 'Active: 10645240 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208052 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533032 kB' 'Mapped: 167380 kB' 'Shmem: 9678436 kB' 'KReclaimable: 202088 kB' 'Slab: 506552 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304464 kB' 'KernelStack: 16288 kB' 'PageTables: 8220 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 
kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11587808 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200884 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.869 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.869 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.869 11:45:57
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.870 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # 
[[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:44.871 nr_hugepages=1024 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:44.871 
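The trace above is a helper scanning `/proc/meminfo` one line at a time, skipping every key until it reaches the requested one (here `HugePages_Rsvd`, which yields 0). A minimal self-contained sketch of that parsing pattern, using a here-document as a stand-in for `/proc/meminfo` (this is an illustrative reconstruction, not SPDK's actual `setup/common.sh`):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern driving the [[ key == ... ]] / continue
# trace: split each "Key: value" line on ':' and whitespace, and echo the
# value for the requested key. The here-doc input is a stand-in for
# /proc/meminfo so the sketch runs anywhere.
get_meminfo() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done <<'EOF'
MemTotal: 92293504 kB
HugePages_Total: 1024
HugePages_Rsvd: 0
EOF
    return 1
}

get_meminfo HugePages_Rsvd   # -> 0
```

Because `IFS=': '` treats both the colon and whitespace as field separators, `read -r var val _` leaves the key in `var`, the number in `val`, and any trailing unit such as `kB` in `_`.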
resv_hugepages=0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:44.871 surplus_hugepages=0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:44.871 anon_hugepages=0 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74815880 kB' 'MemAvailable: 78174396 kB' 'Buffers: 12156 kB' 'Cached: 13577192 kB' 
'SwapCached: 0 kB' 'Active: 10645420 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208232 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533164 kB' 'Mapped: 167380 kB' 'Shmem: 9678456 kB' 'KReclaimable: 202088 kB' 'Slab: 506552 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304464 kB' 'KernelStack: 16448 kB' 'PageTables: 8760 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11589156 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:44.871 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:44.873 11:45:57 
setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:44.873 11:45:57 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 41979856 kB' 'MemUsed: 6090036 kB' 'SwapCached: 0 kB' 'Active: 3237548 kB' 'Inactive: 100732 kB' 'Active(anon): 3043888 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3033840 kB' 'Mapped: 89100 kB' 'AnonPages: 307784 kB' 'Shmem: 2739448 kB' 'KernelStack: 8568 kB' 'PageTables: 4240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235344 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 155016 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.873 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:44.873 [... identical IFS/read/[[ ... ]]/continue xtrace iterations repeat for each remaining node0 meminfo field until HugePages_Surp matches ...] 11:45:57 setup.sh.hugepages.default_setup --
setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:44.874 node0=1024 expecting 1024 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:44.874 00:04:44.874 real 0m9.075s 00:04:44.874 user 0m1.673s 00:04:44.874 sys 0m2.642s 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:44.874 11:45:57 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:44.874 ************************************ 00:04:44.874 END TEST default_setup 00:04:44.874 ************************************ 00:04:44.874 11:45:57 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:44.874 11:45:57 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:44.874 11:45:57 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:44.874 11:45:57 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:44.874 11:45:57 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:44.874 ************************************ 00:04:44.874 START TEST per_node_1G_alloc 
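The xtrace above repeatedly exercises setup/common.sh's field-matching loop: read a `key: value` pair, `continue` until the requested key matches, then echo the value. As a minimal, hypothetical re-creation of that pattern (function name taken from the trace; the real setup/common.sh additionally selects per-node meminfo files and strips the `Node N` prefix with mapfile):

```shell
# Simplified sketch of the get_meminfo pattern traced above:
# scan a meminfo-style file field by field, splitting on ': ',
# and print the value of the requested key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo}
    local var val _
    while IFS=': ' read -r var val _; do
        if [[ $var == "$get" ]]; then
            echo "$val"   # e.g. "1024" for HugePages_Total
            return 0
        fi
    done < "$mem_f"
    return 1
}
```

Each `[[ ... == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue` pair in the log is one iteration of this loop under `set -x`, which is why every meminfo field appears once.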
00:04:44.874 ************************************ 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1123 -- # per_node_1G_alloc 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:44.874 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 
2 > 0 )) 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:44.875 11:45:58 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:48.219 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:48.219 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.5 (8086 2021): 
Already using the vfio-pci driver 00:04:48.219 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:48.219 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 
00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.219 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74808492 kB' 'MemAvailable: 78167008 kB' 'Buffers: 12156 kB' 'Cached: 13577296 kB' 'SwapCached: 0 kB' 'Active: 10644432 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207244 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532012 kB' 'Mapped: 166436 kB' 'Shmem: 9678560 kB' 'KReclaimable: 202088 kB' 'Slab: 506348 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304260 kB' 'KernelStack: 16336 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581904 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201140 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 
'DirectMap1G: 84934656 kB' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 [... identical IFS/read/[[ ... ]]/continue xtrace iterations repeat for each remaining /proc/meminfo field ...] 11:46:01
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.220 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 
11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.221 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.486 11:46:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.486 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74808020 kB' 'MemAvailable: 78166536 kB' 'Buffers: 12156 kB' 'Cached: 13577296 kB' 'SwapCached: 0 kB' 'Active: 10645132 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207944 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532632 kB' 'Mapped: 166428 kB' 'Shmem: 9678560 kB' 'KReclaimable: 202088 kB' 'Slab: 506340 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304252 kB' 'KernelStack: 16400 kB' 'PageTables: 8352 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581756 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201140 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 
'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.487 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.487 11:46:01 [identical @32/@31 trace cycle repeated for each remaining /proc/meminfo field, MemAvailable through HugePages_Free] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.488 11:46:01
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.488 
11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74808920 kB' 'MemAvailable: 78167436 kB' 'Buffers: 12156 kB' 'Cached: 13577316 kB' 'SwapCached: 0 kB' 'Active: 10644716 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207528 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532228 kB' 'Mapped: 166484 kB' 'Shmem: 9678580 kB' 'KReclaimable: 202088 kB' 'Slab: 506416 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304328 kB' 'KernelStack: 16224 kB' 'PageTables: 8052 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11580452 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.488 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 
-- # continue 00:04:48.489 [trace condensed: each /proc/meminfo key from MemFree through HugePages_Free is tested against HugePages_Rsvd and skipped via "continue"] 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:48.490 nr_hugepages=1024 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:48.490 resv_hugepages=0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:48.490 surplus_hugepages=0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:48.490 anon_hugepages=0 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # 
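The trace above is the xtrace of setup/common.sh's per-key scan: `read -r var val _` with `IFS=': '` splits each meminfo line, non-matching keys hit `continue`, and the matching key's value is echoed. As a standalone sketch (the function name and sample input below are illustrative, not taken from the SPDK sources), the lookup boils down to:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the lookup pattern traced above (not the actual
# setup/common.sh source): scan "Key: value" lines and print one key's value.
get_meminfo_value() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped - this corresponds to the long run of
        # "continue" lines in the trace; a match ends with echo and "return 0".
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

printf '%s\n' 'HugePages_Total: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' |
    get_meminfo_value HugePages_Rsvd    # prints 0
```

Because `IFS=': '` treats both the colon and the space as separators, a unit suffix such as "kB" lands in the throwaway `_` field, which is why the trace's values come out bare.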
(( 1024 == nr_hugepages + surp + resv )) 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74809048 kB' 'MemAvailable: 78167564 kB' 'Buffers: 12156 kB' 'Cached: 13577320 kB' 'SwapCached: 0 kB' 'Active: 10644400 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207212 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531908 
kB' 'Mapped: 166476 kB' 'Shmem: 9678584 kB' 'KReclaimable: 202088 kB' 'Slab: 506416 kB' 'SReclaimable: 202088 kB' 'SUnreclaim: 304328 kB' 'KernelStack: 16192 kB' 'PageTables: 7916 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581968 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.490 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical setup/common.sh@31-32 checks repeated for every remaining /proc/meminfo field until HugePages_Total ...] 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024
== nr_hugepages + surp + resv )) 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:48.492 11:46:01 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 43028148 kB' 'MemUsed: 5041744 kB' 'SwapCached: 0 kB' 'Active: 3237876 kB' 'Inactive: 100732 kB' 'Active(anon): 3044216 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3033976 kB' 'Mapped: 88804 kB' 'AnonPages: 307820 kB' 'Shmem: 2739584 kB' 'KernelStack: 8440 kB' 'PageTables: 3680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235348 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 155020 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.492 11:46:01 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical setup/common.sh@31-32 checks repeated for each remaining node0 meminfo field, none matching HugePages_Surp ...] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # 
[[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.493 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:48.494 11:46:02 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223612 kB' 'MemFree: 31781012 kB' 'MemUsed: 12442600 kB' 'SwapCached: 0 kB' 'Active: 7407292 kB' 'Inactive: 3372972 kB' 'Active(anon): 7163764 kB' 'Inactive(anon): 0 kB' 'Active(file): 243528 kB' 'Inactive(file): 3372972 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10555540 kB' 'Mapped: 77680 kB' 'AnonPages: 224896 kB' 'Shmem: 6939040 kB' 'KernelStack: 7752 kB' 'PageTables: 4504 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121760 kB' 'Slab: 271004 kB' 'SReclaimable: 121760 kB' 'SUnreclaim: 149244 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.494 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
[... setup/common.sh@31-32: fields MemFree through FilePmdMapped skipped via continue while scanning node1 meminfo for HugePages_Surp ...]
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc
-- setup/common.sh@31 -- # IFS=': ' 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in 
"${!nodes_test[@]}"
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512'
00:04:48.495 node0=512 expecting 512
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512'
00:04:48.495 node1=512 expecting 512
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]]
00:04:48.495
00:04:48.495 real 0m4.055s
00:04:48.495 user 0m1.555s
00:04:48.495 sys 0m2.603s
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable
00:04:48.495 11:46:02 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x
00:04:48.495 ************************************
00:04:48.495 END TEST per_node_1G_alloc
00:04:48.495 ************************************
00:04:48.755 11:46:02 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0
00:04:48.755 11:46:02 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc
00:04:48.755 11:46:02 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:04:48.755 11:46:02 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable
00:04:48.755 11:46:02 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x
00:04:48.755
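The get_meminfo scans that dominate the trace above (setup/common.sh@17-33) all follow one pattern: pick /proc/meminfo or the per-node /sys file, strip the "Node N " prefix, then read "field: value" pairs, skipping every field with continue until the requested one matches. A minimal standalone sketch of that pattern (a re-implementation for illustration, not the SPDK script itself):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern shown in the xtrace above.
shopt -s extglob   # needed for the +([0-9]) pattern below

get_meminfo() {
    local get=$1 node=$2
    local mem_f=/proc/meminfo
    local var val _
    # Per-node counters live under /sys and carry a "Node N " prefix.
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local -a mem
    mapfile -t mem < "$mem_f"
    mem=("${mem[@]#Node +([0-9]) }")   # drop the "Node 1 " prefix, if any
    local line
    for line in "${mem[@]}"; do
        # Entries look like "HugePages_Surp:   0" or "MemTotal: ... kB".
        IFS=': ' read -r var val _ <<< "$line"
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

get_meminfo MemTotal        # system-wide value in kB
get_meminfo HugePages_Surp  # surplus hugepages, typically 0
```

On a NUMA box, `get_meminfo HugePages_Free 1` reads node1's counters the same way the log does; without a node argument it falls back to the system-wide /proc/meminfo.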
************************************ 00:04:48.755 START TEST even_2G_alloc 00:04:48.755 ************************************ 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1123 -- # even_2G_alloc 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 
-- # : 512
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 ))
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:04:48.755 11:46:02 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:52.955 0000:00:04.7 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver
00:04:52.955 0000:00:04.6 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.5 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.4 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.3 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.2 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.1 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:00:04.0 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.7 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.6 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.5 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.4 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.3 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.2 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.1 (8086 2021): Already using the vfio-pci driver
00:04:52.955 0000:80:04.0 (8086 2021): Already using the vfio-pci driver
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:52.955 11:46:05
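The nodes_test[] bookkeeping traced above comes from get_test_nr_hugepages_per_node: the requested size is converted to a page count and split evenly across the NUMA nodes, which is why both tests expect 512 pages per node. A standalone sketch of that split (variable names mirror hugepages.sh for readability only; the 2048 kB default hugepage size is an assumption inferred from 2097152 / 1024 in the trace):

```shell
#!/usr/bin/env bash
# Sketch of the even per-node hugepage split driven by the log above:
# 2097152 kB requested / 2048 kB per hugepage = 1024 pages, divided
# over the detected NUMA nodes (512 + 512 here).
size_kb=2097152          # requested allocation (assumed kB)
default_hugepages=2048   # assumed default hugepage size in kB
no_nodes=2               # NUMA nodes on the test box

nr_hugepages=$(( size_kb / default_hugepages ))

declare -a nodes_test
_nr=$nr_hugepages
_no=$no_nodes
while (( _no > 0 )); do
    nodes_test[_no - 1]=$(( _nr / _no ))      # integer share for this node
    _nr=$(( _nr - nodes_test[_no - 1] ))      # remainder for the rest
    _no=$(( _no - 1 ))
done

for node in "${!nodes_test[@]}"; do
    echo "node$node=${nodes_test[node]}"
done
# node0=512
# node1=512
```

With an odd page count the higher-numbered node takes the integer share first, so node0 absorbs the remainder and ends up one page larger.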
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74780516 kB' 'MemAvailable: 78139016 kB' 'Buffers: 12156 kB' 'Cached: 13577452 kB' 'SwapCached: 0 kB' 'Active: 10645684 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208496 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533132 kB' 'Mapped: 166456 kB' 'Shmem: 9678716 kB' 'KReclaimable: 202056 kB' 'Slab: 506380 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304324 kB' 'KernelStack: 16304 kB' 'PageTables: 8120 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11582704 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.955 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 
11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:52.956 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74781788 kB' 'MemAvailable: 78140288 kB' 'Buffers: 12156 kB' 'Cached: 13577456 kB' 
'SwapCached: 0 kB' 'Active: 10645976 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208788 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533468 kB' 'Mapped: 166516 kB' 'Shmem: 9678720 kB' 'KReclaimable: 202056 kB' 'Slab: 506424 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304368 kB' 'KernelStack: 16144 kB' 'PageTables: 7860 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11585416 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.957 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 
11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.958 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.959 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74782500 kB' 'MemAvailable: 78141000 kB' 'Buffers: 12156 kB' 'Cached: 13577472 kB' 'SwapCached: 0 kB' 'Active: 10645088 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207900 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533024 kB' 'Mapped: 166528 kB' 'Shmem: 9678736 kB' 'KReclaimable: 202056 kB' 'Slab: 506396 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304340 kB' 'KernelStack: 16192 kB' 'PageTables: 7764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11582744 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200948 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:05 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 
11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.959 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:52.960 nr_hugepages=1024 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:52.960 resv_hugepages=0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:52.960 surplus_hugepages=0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:52.960 anon_hugepages=0 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:52.960 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:52.961 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74781924 kB' 'MemAvailable: 78140424 kB' 'Buffers: 12156 kB' 'Cached: 13577492 kB' 'SwapCached: 0 kB' 'Active: 10645416 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208228 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532856 kB' 'Mapped: 166528 kB' 'Shmem: 9678756 kB' 'KReclaimable: 202056 kB' 'Slab: 506400 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304344 kB' 'KernelStack: 16128 kB' 'PageTables: 7900 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11582764 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200916 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 
1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
[[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.961 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.961 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.962 
11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:52.962 11:46:06
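The get_nodes trace above enumerates /sys/devices/system/node/node<N> with an extglob pattern, counts the nodes (no_nodes=2), and assigns an even 512-page share per node. A minimal sketch of that even-split logic, with the base directory as a parameter so it can run against a scratch directory instead of a real /sys; the function name `even_split` is ours, not SPDK's:

```shell
#!/usr/bin/env bash
# Sketch of the even allocation seen at hugepages.sh@29-32: enumerate
# node directories, and give each node an equal share of the hugepage
# total. even_split is a hypothetical helper name for illustration.
shopt -s extglob nullglob

even_split() {
    local total=$1 base=$2 node
    local -a nodes=("$base"/node+([0-9]))   # e.g. node0 node1
    local no_nodes=${#nodes[@]}
    (( no_nodes > 0 )) || return 1
    for node in "${nodes[@]}"; do
        # ${node##*node} strips everything through the last "node",
        # leaving the numeric index, as in nodes_sys[${node##*node}]=512
        printf '%s %s\n' "${node##*node}" $(( total / no_nodes ))
    done
}
```

With two node directories and a total of 1024 pages this reproduces the 512-per-node split in the trace.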
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:52.962 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 43011112 kB' 'MemUsed: 5058780 kB' 'SwapCached: 0 kB' 'Active: 3236880 kB' 'Inactive: 100732 kB' 'Active(anon): 3043220 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3034096 kB' 'Mapped: 88820 kB' 'AnonPages: 306792 kB' 'Shmem: 2739704 kB' 'KernelStack: 8424 kB' 'PageTables: 3632 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235284 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 154956 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc --
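The common.sh@22-24 lines above show the source selection for get_meminfo: default to the global /proc/meminfo, but switch to the node-local sysfs copy when a node index is given and that file exists. A hedged sketch of that fallback, with a `sysroot` parameter added by us so the existence check is testable outside a real /sys; `pick_meminfo` is our name, not SPDK's helper:

```shell
#!/usr/bin/env bash
# Sketch of the meminfo source selection at setup/common.sh@22-24:
# prefer /sys/devices/system/node/node<N>/meminfo when a node index is
# given and the file exists, otherwise fall back to /proc/meminfo.
# sysroot defaults to "" (real /sys); tests can point it elsewhere.
pick_meminfo() {
    local node=$1 sysroot=${2:-}
    local mem_f=/proc/meminfo
    if [[ -n $node && -e $sysroot/sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=$sysroot/sys/devices/system/node/node$node/meminfo
    fi
    echo "$mem_f"
}
```

Called with an empty node it returns /proc/meminfo, matching the first get_meminfo call in this chunk; with node=0 it returns the node0 path read above.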
setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # 
continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.963 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@20 -- # local mem_f mem 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223612 kB' 'MemFree: 31770284 kB' 'MemUsed: 12453328 kB' 'SwapCached: 0 kB' 'Active: 7408520 kB' 'Inactive: 3372972 kB' 'Active(anon): 7164992 kB' 'Inactive(anon): 0 kB' 'Active(file): 243528 kB' 'Inactive(file): 3372972 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10555576 kB' 'Mapped: 77688 kB' 'AnonPages: 226076 kB' 'Shmem: 6939076 kB' 'KernelStack: 7832 kB' 'PageTables: 4312 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121728 kB' 'Slab: 271120 kB' 'SReclaimable: 121728 kB' 'SUnreclaim: 149392 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 
11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.964 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:52.965 node0=512 expecting 512 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:52.965 11:46:06 
setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:52.965 node1=512 expecting 512 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:52.965 00:04:52.965 real 0m3.985s 00:04:52.965 user 0m1.539s 00:04:52.965 sys 0m2.550s 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:52.965 11:46:06 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:52.965 ************************************ 00:04:52.965 END TEST even_2G_alloc 00:04:52.965 ************************************ 00:04:52.965 11:46:06 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:52.965 11:46:06 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:52.965 11:46:06 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:52.965 11:46:06 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:52.965 11:46:06 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:52.965 ************************************ 00:04:52.965 START TEST odd_alloc 00:04:52.965 ************************************ 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1123 -- # odd_alloc 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # 
get_test_nr_hugepages_per_node 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:52.965 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:52.966 11:46:06 
setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:52.966 11:46:06 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:56.253 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:56.253 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:04:56.253 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:04:56.517 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:04:56.517 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:04:56.517 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:04:56.517 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:04:56.517 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@93 -- # local resv 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74766432 kB' 'MemAvailable: 78124932 kB' 'Buffers: 12156 kB' 'Cached: 13577600 kB' 'SwapCached: 0 kB' 'Active: 10645688 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208500 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532924 kB' 
'Mapped: 166460 kB' 'Shmem: 9678864 kB' 'KReclaimable: 202056 kB' 'Slab: 506344 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304288 kB' 'KernelStack: 16080 kB' 'PageTables: 7744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11583108 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201028 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.517 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:09 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.518 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74770020 kB' 'MemAvailable: 78128520 kB' 'Buffers: 12156 kB' 'Cached: 13577604 kB' 'SwapCached: 0 kB' 'Active: 10646300 kB' 'Inactive: 3473704 kB' 'Active(anon): 10209112 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533688 kB' 'Mapped: 166508 kB' 'Shmem: 9678868 kB' 'KReclaimable: 202056 kB' 'Slab: 506372 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304316 kB' 'KernelStack: 16128 kB' 'PageTables: 8212 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11586280 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201076 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.518 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.519 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74774100 kB' 'MemAvailable: 78132600 kB' 'Buffers: 12156 kB' 'Cached: 13577620 kB' 'SwapCached: 0 kB' 'Active: 10647456 kB' 'Inactive: 3473704 kB' 'Active(anon): 10210268 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 
'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 534812 kB' 'Mapped: 167012 kB' 'Shmem: 9678884 kB' 'KReclaimable: 202056 kB' 'Slab: 506372 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304316 kB' 'KernelStack: 16256 kB' 'PageTables: 8384 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11585028 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200916 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.520 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.520 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.521 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:04:56.522 nr_hugepages=1025 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:56.522 resv_hugepages=0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:56.522 surplus_hugepages=0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:56.522 anon_hugepages=0 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == 
nr_hugepages )) 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74774792 kB' 'MemAvailable: 78133292 kB' 'Buffers: 12156 kB' 'Cached: 13577640 kB' 'SwapCached: 0 kB' 'Active: 10649384 kB' 'Inactive: 3473704 kB' 'Active(anon): 10212196 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 536700 kB' 'Mapped: 167308 kB' 'Shmem: 9678904 kB' 'KReclaimable: 202056 kB' 'Slab: 506372 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304316 kB' 'KernelStack: 16464 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 53485756 kB' 'Committed_AS: 11587164 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.522 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.522 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31-32 -- # [ ... identical "IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue" iterations repeated for each remaining /proc/meminfo field (Cached through FileHugePages) elided; none matches HugePages_Total ... ]
00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 
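The trace above is `setup/common.sh` reading `/proc/meminfo` one field at a time and discarding every line until the requested key (`HugePages_Total`, echoed as 1025) matches. A minimal sketch of that scan, assuming a hypothetical function name (`get_meminfo_value`) rather than the actual SPDK helper:

```shell
# Sketch of the field scan the trace repeats: split each meminfo line on ': ',
# skip non-matching keys (the long run of 'continue' lines above), and echo
# the value once the requested key is found.
get_meminfo_value() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # mirrors the per-field 'continue' lines
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Usage (on the node traced above, this key produced 1025):
#   get_meminfo_value HugePages_Total
```

The second positional parameter lets the same scan run against a per-node file such as `/sys/devices/system/node/node0/meminfo`, which is exactly the switch the trace makes further down.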
00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e 
/sys/devices/system/node/node0/meminfo ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 43004524 kB' 'MemUsed: 5065368 kB' 'SwapCached: 0 kB' 'Active: 3243172 kB' 'Inactive: 100732 kB' 'Active(anon): 3049512 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3034208 kB' 'Mapped: 88836 kB' 'AnonPages: 313452 kB' 'Shmem: 2739816 kB' 'KernelStack: 8408 kB' 'PageTables: 3644 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235452 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 155124 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.791 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.791 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.791 [ ... identical "IFS=': ' / read -r var val _ / [[ <field> == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue" iterations repeated for the remaining node0 meminfo fields (MemFree through FileHugePages) elided; none matches HugePages_Surp ... ] 11:46:10 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223612 kB' 'MemFree: 31772872 kB' 'MemUsed: 12450740 kB' 'SwapCached: 0 kB' 'Active: 7407904 kB' 'Inactive: 3372972 kB' 'Active(anon): 7164376 kB' 'Inactive(anon): 0 kB' 'Active(file): 243528 kB' 'Inactive(file): 3372972 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10555588 kB' 'Mapped: 78056 kB' 'AnonPages: 225356 
kB' 'Shmem: 6939088 kB' 'KernelStack: 7864 kB' 'PageTables: 4544 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121728 kB' 'Slab: 270920 kB' 'SReclaimable: 121728 kB' 'SUnreclaim: 149192 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.792 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 
11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:04:56.793 node0=512 expecting 513 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:56.793 
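The long parse loops above come from `get_meminfo` in setup/common.sh: it reads `/sys/devices/system/node/nodeN/meminfo` line by line, `continue`-ing past every field until it matches the requested key (here `HugePages_Surp`), which is why the log shows one iteration per meminfo line. A simplified sketch of that parser — unlike the original, the file path is passed explicitly rather than derived from a node number, an assumption made here for testability:

```shell
#!/usr/bin/env bash
# Simplified sketch of setup/common.sh's get_meminfo: fetch one field from a
# meminfo-style file. Per-node files under /sys/devices/system/node/nodeN/
# prefix each line with "Node N "; strip that so the field name lands in $var.
get_meminfo() {
  local get=$1 mem_f=$2 var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      echo "$val"
      return 0
    fi
  done < <(sed -E 's/^Node [0-9]+ +//' "$mem_f")
  return 1
}
```

The traced variant instead slurps the whole file with `mapfile`, strips the `Node N` prefix with a pattern substitution, and loops with `[[ $var == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]`-style literal matches, but the effect is the same: echo the value of one field and return 0.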
11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:56.793 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:04:56.794 node1=513 expecting 512 00:04:56.794 11:46:10 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:04:56.794 00:04:56.794 real 0m3.952s 00:04:56.794 user 0m1.569s 00:04:56.794 sys 0m2.487s 00:04:56.794 11:46:10 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:04:56.794 11:46:10 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:56.794 ************************************ 00:04:56.794 END TEST odd_alloc 00:04:56.794 ************************************ 00:04:56.794 11:46:10 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:04:56.794 11:46:10 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:04:56.794 11:46:10 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:04:56.794 11:46:10 setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:04:56.794 11:46:10 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:56.794 ************************************ 00:04:56.794 START TEST custom_alloc 00:04:56.794 ************************************ 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1123 -- # custom_alloc 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local 
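odd_alloc passes even though the trace echoes "node0=512 expecting 513" and "node1=513 expecting 512": the `sorted_t`/`sorted_s` arrays in hugepages.sh compare the per-node counts as sets, so it does not matter which node ended up with the odd page. A sketch of that order-insensitive comparison, using the counts from this run:

```shell
#!/usr/bin/env bash
# Sketch of hugepages.sh's sorted_t/sorted_s trick: use each count as an
# indexed-array key, so the key lists expand in ascending numeric order and
# compare as multisets regardless of which node holds the odd page.
declare -a sorted_t sorted_s
nodes_test=([0]=512 [1]=513)   # what the test configured per node
nodes_sys=([0]=513 [1]=512)    # what the kernel reports per node
for node in "${!nodes_test[@]}"; do
  sorted_t[nodes_test[node]]=1
  sorted_s[nodes_sys[node]]=1
done
[[ ${!sorted_t[*]} == "${!sorted_s[*]}" ]] && echo match   # prints "match"
```

This is why the final check in the trace is `[[ 512 513 == \5\1\2\ \5\1\3 ]]`: both sides are the sorted key lists, already in the same order.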
nodes_hp 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:56.794 11:46:10 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- 
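The `get_test_nr_hugepages 1048576` → `nr_hugepages=512` and `get_test_nr_hugepages 2097152` → `nr_hugepages=1024` steps above are plain division by the default hugepage size. A sketch of that arithmetic, assuming sizes in kB (consistent with the `Hugepagesize: 2048 kB` reported later in the trace):

```shell
#!/usr/bin/env bash
# Sketch of get_test_nr_hugepages' size -> page-count step. With 2048 kB
# pages, 1048576 kB (1 GiB) becomes 512 pages and 2097152 kB (2 GiB)
# becomes 1024 pages, matching the nr_hugepages values in the trace.
default_hugepages=2048   # kB per hugepage

get_test_nr_hugepages() {
  local size=$1
  (( size >= default_hugepages )) || return 1
  echo $(( size / default_hugepages ))
}
```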
setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@67 -- # local -g nodes_test 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:56.794 11:46:10 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:00.995 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:00.995 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 
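The assembled `HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024'` tells scripts/setup.sh to reserve 512 pages on node 0 and 1024 on node 1. On a stock Linux kernel, per-node 2 MiB reservations go through a standard sysfs knob; the helper below only builds that path (the path layout is the documented kernel interface, not taken from setup.sh itself):

```shell
#!/usr/bin/env bash
# Build the standard kernel sysfs path for a node's hugepage pool.
# Writing a count to this file (as root) reserves that many pages, e.g.:
#   echo 512 > "$(hugepage_sysfs_path 0)"
hugepage_sysfs_path() {
  local node=$1 size_kb=${2:-2048}
  echo "/sys/devices/system/node/node${node}/hugepages/hugepages-${size_kb}kB/nr_hugepages"
}
```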
00:05:00.995 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:00.995 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:00.995 11:46:13 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 73715468 kB' 'MemAvailable: 77073968 kB' 'Buffers: 12156 kB' 'Cached: 13577752 kB' 'SwapCached: 0 kB' 'Active: 10644952 kB' 'Inactive: 3473704 kB' 'Active(anon): 10207764 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531896 kB' 'Mapped: 166588 kB' 'Shmem: 9679016 kB' 'KReclaimable: 202056 kB' 'Slab: 506452 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304396 kB' 'KernelStack: 16080 kB' 'PageTables: 7732 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11581032 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 201012 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 
kB' 'DirectMap1G: 84934656 kB' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.995 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 
11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:00.996 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 
11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 73721000 kB' 'MemAvailable: 77079500 kB' 'Buffers: 12156 kB' 'Cached: 13577752 kB' 'SwapCached: 0 kB' 'Active: 10645636 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208448 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532672 kB' 'Mapped: 166512 kB' 'Shmem: 9679016 kB' 'KReclaimable: 202056 kB' 'Slab: 506432 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304376 kB' 'KernelStack: 16112 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11583664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 
-- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 
11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.997 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- 
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [trace condensed: the IFS=': ' read loop skips each remaining non-matching /proc/meminfo key (Bounce, WritebackTmp, CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, CmaTotal, CmaFree, Unaccepted, HugePages_Total, HugePages_Free, HugePages_Rsvd) until it reaches HugePages_Surp]
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:00.998 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:00.999 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 73723016 kB' 'MemAvailable: 77081516 kB' 'Buffers: 12156 kB' 'Cached: 13577784 kB' 'SwapCached: 0 kB' 'Active: 10645536 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208348 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532620 kB' 'Mapped: 166516 kB' 'Shmem: 9679048 kB' 'KReclaimable: 202056 kB' 'Slab: 506432 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304376 kB' 'KernelStack: 16208 kB' 'PageTables: 7912 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11585448 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200996 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB'
00:05:00.999 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [trace condensed: the loop skips every key from MemTotal through HugePages_Free until it reaches HugePages_Rsvd]
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536
00:05:01.000 nr_hugepages=1536
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0
00:05:01.000 resv_hugepages=0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0
00:05:01.000 surplus_hugepages=0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0
00:05:01.000 anon_hugepages=0
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv ))
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages ))
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18-29 -- # [trace condensed: same local node=, var/val, mem_f=/proc/meminfo, mapfile -t mem, and mem=("${mem[@]#Node +([0-9]) }") setup as above]
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:01.000 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 73723416 kB' 'MemAvailable: 77081916 kB' 'Buffers: 12156 kB' 'Cached: 13577792 kB' 'SwapCached: 0 kB' 'Active: 10645924 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208736 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533020 kB' 'Mapped: 166524 kB' 'Shmem: 9679056 kB' 'KReclaimable: 202056 kB' 'Slab: 506424 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304368 kB' 'KernelStack: 16352 kB' 'PageTables: 8420 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52962492 kB' 'Committed_AS: 11583708 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200980 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB'
00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31-32 -- # [trace condensed: the loop begins scanning for HugePages_Total; trace truncated mid-scan, last key compared: Active(file)]
00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.001 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@112 -- # get_nodes 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.002 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 43010916 kB' 'MemUsed: 5058976 kB' 'SwapCached: 0 kB' 'Active: 3237892 kB' 'Inactive: 100732 kB' 'Active(anon): 3044232 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3034360 kB' 'Mapped: 88844 kB' 'AnonPages: 307492 kB' 'Shmem: 2739968 kB' 'KernelStack: 8408 kB' 'PageTables: 3572 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235316 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 154988 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:01.002 11:46:14
[setup/common.sh@31-@32 xtrace trimmed: the same scan loop skips every node0 meminfo key ahead of HugePages_Surp via "continue"]
00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc --
setup/common.sh@31 -- # read -r var val _ 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.003 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 
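The trace above shows `get_meminfo HugePages_Surp 1` selecting `/sys/devices/system/node/node1/meminfo`, stripping the `Node N ` prefix from each line, and scanning `key: value` pairs until it hits the requested field. A simplified, self-contained sketch of that technique follows; the function and variable names (`get_meminfo`, `mem_f`, `var`, `val`) mirror the trace, but the body is an illustration rather than the SPDK `setup/common.sh` source.

```shell
#!/usr/bin/env bash
# extglob is needed for the +([0-9]) pattern used to strip the node prefix.
shopt -s extglob

# get_meminfo FIELD [NODE] -> prints the numeric value of FIELD (kB counts
# are printed without the unit), or 0 if the field is absent.
get_meminfo() {
	local get=$1 node=${2:-}
	local mem_f=/proc/meminfo mem var val _

	# Per-node meminfo lives under sysfs; fall back to the global file.
	if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
		mem_f=/sys/devices/system/node/node$node/meminfo
	fi

	mapfile -t mem < "$mem_f"
	# Per-node files prefix every line with "Node N "; drop that prefix
	# so both file layouts parse identically below.
	mem=("${mem[@]#Node +([0-9]) }")

	local line
	for line in "${mem[@]}"; do
		IFS=': ' read -r var val _ <<< "$line"
		if [[ $var == "$get" ]]; then
			echo "${val:-0}"
			return 0
		fi
	done
	echo 0
}
```

The thousands of `continue` / `read -r var val _` lines in the log are simply this scan loop traced once per meminfo field until the match (`HugePages_Surp`) is found.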
setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 44223612 kB' 'MemFree: 30711296 kB' 'MemUsed: 13512316 kB' 'SwapCached: 0 kB' 'Active: 7407360 kB' 'Inactive: 3372972 kB' 'Active(anon): 7163832 kB' 'Inactive(anon): 0 kB' 'Active(file): 243528 kB' 'Inactive(file): 3372972 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 10555592 kB' 'Mapped: 77672 kB' 'AnonPages: 224768 kB' 'Shmem: 6939092 kB' 'KernelStack: 7768 kB' 'PageTables: 4188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 121728 kB' 'Slab: 271108 kB' 'SReclaimable: 121728 kB' 'SUnreclaim: 149380 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.004 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 
0 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:01.005 node0=512 expecting 512 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:01.005 node1=1024 expecting 1024 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:01.005 00:05:01.005 real 0m4.061s 00:05:01.005 user 0m1.592s 00:05:01.005 sys 0m2.574s 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:01.005 11:46:14 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:01.005 ************************************ 00:05:01.005 END TEST custom_alloc 00:05:01.005 ************************************ 00:05:01.005 11:46:14 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:01.005 11:46:14 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:01.005 11:46:14 setup.sh.hugepages -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:01.005 11:46:14 
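The `custom_alloc` test ends by printing each node's measured hugepage count next to its expectation (`node0=512 expecting 512`, `node1=1024 expecting 1024`) and then comparing the comma-joined lists (`[[ 512,1024 == 512,1024 ]]`). A minimal sketch of that check, with hard-coded measurements standing in for the real `get_meminfo` results (only the `nodes_test` name comes from the trace; the rest is illustrative):

```shell
#!/usr/bin/env bash
# Measured per-node hugepage counts (index = NUMA node) and expectations.
nodes_test=(512 1024)
expected=(512 1024)

got=() want=()
for node in "${!nodes_test[@]}"; do
	echo "node$node=${nodes_test[node]} expecting ${expected[node]}"
	got+=("${nodes_test[node]}")
	want+=("${expected[node]}")
done

# Join both lists with commas and compare in one shot, mirroring the
# final [[ 512,1024 == 512,1024 ]] check in the trace.
layout_ok() { local IFS=,; [[ "${got[*]}" == "${want[*]}" ]]; }
layout_ok && echo "hugepage layout OK"
```

Joining with commas makes the pass/fail condition a single string comparison, which is why the log shows the escaped pattern `\5\1\2\,\1\0\2\4` — xtrace renders the right-hand side of `[[ == ]]` with every character backslash-quoted.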
setup.sh.hugepages -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:01.005 11:46:14 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:01.005 ************************************ 00:05:01.005 START TEST no_shrink_alloc 00:05:01.005 ************************************ 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1123 -- # no_shrink_alloc 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g 
nodes_test 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:01.005 11:46:14 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:05.205 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:05.205 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:05.205 0000:80:04.0 (8086 
2021): Already using the vfio-pci driver 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74766164 kB' 'MemAvailable: 78124664 kB' 'Buffers: 12156 kB' 'Cached: 13577896 kB' 'SwapCached: 0 kB' 'Active: 10645540 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208352 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 531900 kB' 'Mapped: 166616 kB' 'Shmem: 9679160 kB' 'KReclaimable: 202056 kB' 'Slab: 506316 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304260 kB' 'KernelStack: 16064 kB' 'PageTables: 7608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11583976 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200868 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.205 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the same continue / IFS=': ' / read -r cycle repeats for every remaining /proc/meminfo key (MemAvailable through HardwareCorrupted) until the target key is reached]
00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@31 -- # read -r var val _ 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74767152 kB' 'MemAvailable: 78125652 kB' 'Buffers: 12156 kB' 'Cached: 13577900 kB' 'SwapCached: 0 kB' 'Active: 10645552 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208364 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532468 kB' 'Mapped: 166548 kB' 'Shmem: 9679164 kB' 'KReclaimable: 202056 kB' 'Slab: 506288 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304232 kB' 'KernelStack: 16032 kB' 'PageTables: 7588 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11582500 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200804 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.207 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[xtrace elided: the same continue / IFS=': ' / read -r cycle repeats for every remaining /proc/meminfo key (MemFree through HugePages_Rsvd) until HugePages_Surp is reached]
00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- 
# mem_f=/proc/meminfo 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74767412 kB' 'MemAvailable: 78125912 kB' 'Buffers: 12156 kB' 'Cached: 13577900 kB' 'SwapCached: 0 kB' 'Active: 10645820 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208632 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532772 kB' 'Mapped: 166548 kB' 'Shmem: 9679164 kB' 'KReclaimable: 202056 kB' 'Slab: 506288 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304232 kB' 'KernelStack: 16208 kB' 'PageTables: 7868 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11586148 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200900 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 
kB' 'DirectMap1G: 84934656 kB' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.209 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 
11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.210 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:05.211 nr_hugepages=1024 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:05.211 resv_hugepages=0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 
00:05:05.211 surplus_hugepages=0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:05.211 anon_hugepages=0 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74766604 kB' 'MemAvailable: 78125104 kB' 'Buffers: 12156 kB' 'Cached: 13577940 kB' 'SwapCached: 0 kB' 'Active: 10645360 kB' 'Inactive: 3473704 kB' 'Active(anon): 10208172 kB' 
'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 532180 kB' 'Mapped: 166548 kB' 'Shmem: 9679204 kB' 'KReclaimable: 202056 kB' 'Slab: 506288 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304232 kB' 'KernelStack: 16112 kB' 'PageTables: 7892 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11584040 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200884 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 
00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.211 11:46:18 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.211 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 
11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 
11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 
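The long run of `[[ <key> == H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] … continue` entries above is a single pass of the meminfo-reading helper traced as `setup/common.sh@31-33`: it walks a meminfo-style file with `IFS=': '`, skipping every key until the requested one matches, then echoes its value (the `echo 1024` / `return 0` entries). A minimal stand-alone condensation of that loop — the function name and sample file below are illustrative, not the script's own:

```shell
#!/usr/bin/env bash
# Sketch of the key-matching loop seen in the trace: IFS=': ' splits each
# "Key: value [unit]" line into var/val, non-matching keys hit "continue"
# (one log entry per skipped key above), and the matching key's value is
# echoed. Helper and file names are hypothetical stand-ins.

meminfo_value() {
    local get=$1 mem_f=$2 var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skipped keys log a "continue" in the trace
        echo "$val"
        return 0
    done < "$mem_f"
    return 1
}

# Usage against a sample file standing in for /proc/meminfo:
printf '%s\n' 'MemTotal: 48069892 kB' 'HugePages_Total: 1024' > /tmp/meminfo.sample
meminfo_value HugePages_Total /tmp/meminfo.sample   # prints 1024
```

The `_` in `read -r var val _` absorbs the trailing `kB` unit, which is why the trace's `echo 1024` carries the bare page count.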
00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:05.212 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@28 -- # mapfile -t mem 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 41971868 kB' 'MemUsed: 6098024 kB' 'SwapCached: 0 kB' 'Active: 3238868 kB' 'Inactive: 100732 kB' 'Active(anon): 3045208 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3034452 kB' 'Mapped: 88852 kB' 'AnonPages: 308280 kB' 'Shmem: 2740060 kB' 'KernelStack: 8584 kB' 'PageTables: 4076 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235340 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 155012 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': 
' 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.213 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical `IFS=': '` / `read -r var val _` / `continue` trace repeated for the remaining /proc/meminfo keys (Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total) until HugePages_Surp matches ...] 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc --
setup/common.sh@31 -- # read -r var val _ 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:05.214 node0=1024 expecting 1024 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:05.214 11:46:18 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:08.509 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:08.509 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:05:08.509 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:05:08.509 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 
-- # local resv 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74757488 kB' 'MemAvailable: 78115988 kB' 'Buffers: 12156 kB' 'Cached: 13578024 kB' 'SwapCached: 0 kB' 'Active: 10650816 kB' 'Inactive: 3473704 kB' 'Active(anon): 10213628 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 
'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 537044 kB' 'Mapped: 167108 kB' 'Shmem: 9679288 kB' 'KReclaimable: 202056 kB' 'Slab: 506416 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304360 kB' 'KernelStack: 16112 kB' 'PageTables: 7768 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11586400 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200884 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:08.509 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical `IFS=': '` / `read -r var val _` / `continue` trace repeated for each /proc/meminfo key (Cached through HardwareCorrupted) until AnonHugePages matches ...]
00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.511 11:46:22
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74759680 kB' 'MemAvailable: 78118180 kB' 'Buffers: 12156 kB' 'Cached: 13578028 kB' 'SwapCached: 0 kB' 'Active: 10646696 kB' 'Inactive: 3473704 kB' 'Active(anon): 10209508 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533416 kB' 'Mapped: 166812 kB' 'Shmem: 9679292 kB' 'KReclaimable: 202056 kB' 'Slab: 506424 kB' 'SReclaimable: 202056 kB' 
'SUnreclaim: 304368 kB' 'KernelStack: 16128 kB' 'PageTables: 7840 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581896 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200836 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.511 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.511 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical `IFS=': '` / `read -r var val _` / `continue` trace repeated for the remaining /proc/meminfo keys ...]
00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.512 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.513 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.513 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.513 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.513 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74760868 kB' 'MemAvailable: 78119368 kB' 'Buffers: 12156 kB' 'Cached: 13578048 kB' 'SwapCached: 0 kB' 'Active: 10646732 kB' 'Inactive: 3473704 kB' 'Active(anon): 10209544 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533492 kB' 'Mapped: 166528 kB' 'Shmem: 9679312 kB' 'KReclaimable: 202056 kB' 'Slab: 506424 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304368 kB' 'KernelStack: 16096 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581916 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200836 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.775 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.775 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.776 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:08.777 nr_hugepages=1024 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:08.777 resv_hugepages=0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:08.777 surplus_hugepages=0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:08.777 anon_hugepages=0 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 
00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 92293504 kB' 'MemFree: 74761188 kB' 'MemAvailable: 78119688 kB' 'Buffers: 12156 kB' 'Cached: 13578072 kB' 'SwapCached: 0 kB' 'Active: 10646724 kB' 'Inactive: 3473704 kB' 'Active(anon): 10209536 kB' 'Inactive(anon): 0 kB' 'Active(file): 437188 kB' 'Inactive(file): 3473704 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 533492 kB' 'Mapped: 166528 kB' 'Shmem: 9679336 kB' 'KReclaimable: 202056 kB' 'Slab: 506424 kB' 'SReclaimable: 202056 kB' 'SUnreclaim: 304368 kB' 'KernelStack: 16096 kB' 'PageTables: 7788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53486780 kB' 'Committed_AS: 11581944 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 200836 kB' 'VmallocChunk: 0 kB' 'Percpu: 54080 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 
kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 550316 kB' 'DirectMap2M: 15902720 kB' 'DirectMap1G: 84934656 kB' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.777 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.778 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # 
no_nodes=2 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 48069892 kB' 'MemFree: 41967588 kB' 'MemUsed: 6102304 kB' 'SwapCached: 0 kB' 'Active: 3239416 kB' 'Inactive: 100732 kB' 'Active(anon): 3045756 kB' 'Inactive(anon): 0 kB' 'Active(file): 193660 kB' 'Inactive(file): 100732 kB' 'Unevictable: 
3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 3034484 kB' 'Mapped: 88860 kB' 'AnonPages: 308836 kB' 'Shmem: 2740092 kB' 'KernelStack: 8408 kB' 'PageTables: 3612 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 80328 kB' 'Slab: 235560 kB' 'SReclaimable: 80328 kB' 'SUnreclaim: 155232 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.779 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 
11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.780 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # echo 0 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:05:08.781 node0=1024 expecting 1024 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:08.781 00:05:08.781 real 0m7.782s 00:05:08.781 user 0m2.919s 00:05:08.781 sys 0m5.071s 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.781 11:46:22 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:08.781 ************************************ 00:05:08.781 END TEST no_shrink_alloc 00:05:08.781 ************************************ 00:05:08.781 11:46:22 setup.sh.hugepages -- common/autotest_common.sh@1142 -- # return 0 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in 
"/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:08.781 11:46:22 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:08.781 00:05:08.781 real 0m33.631s 00:05:08.781 user 0m11.134s 00:05:08.781 sys 0m18.413s 00:05:08.781 11:46:22 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:08.781 11:46:22 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:08.781 ************************************ 00:05:08.781 END TEST hugepages 00:05:08.781 ************************************ 00:05:08.781 11:46:22 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:08.781 11:46:22 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:08.781 11:46:22 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:08.781 11:46:22 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:08.781 11:46:22 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:08.781 ************************************ 00:05:08.781 START TEST driver 00:05:08.781 ************************************ 00:05:08.781 11:46:22 setup.sh.driver -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:09.041 * Looking for test storage... 00:05:09.041 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:09.041 11:46:22 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:09.041 11:46:22 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:09.041 11:46:22 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:14.315 11:46:27 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:14.315 11:46:27 setup.sh.driver -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:14.315 11:46:27 setup.sh.driver -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:14.315 11:46:27 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:14.315 ************************************ 00:05:14.315 START TEST guess_driver 00:05:14.315 ************************************ 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- common/autotest_common.sh@1123 -- # guess_driver 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # 
iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- # (( 214 > 0 )) 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:14.315 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:14.315 Looking for driver=vfio-pci 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ 
_ _ _ marker setup_driver 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output config 00:05:14.315 11:46:27 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:14.316 11:46:27 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- 
setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 
-- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:18.509 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:18.510 11:46:31 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 
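The guess_driver trace above (setup/driver.sh@21 through @49) boils down to: count the entries under /sys/kernel/iommu_groups, confirm `modprobe --show-depends vfio_pci` resolves to real `.ko` modules, and pick `vfio-pci`; otherwise the script falls through to the sentinel string "No valid driver found" that later `[[ ... ]]` checks match against. A condensed sketch of that decision, with the host probes replaced by parameters so it runs anywhere (the real script reads sysfs and modprobe directly; this simplified body is illustrative, not SPDK's source):

```shell
#!/usr/bin/env bash
# Sketch of setup/driver.sh's pick_driver decision as traced in the log:
# vfio-pci is chosen when IOMMU groups exist AND the vfio_pci module
# resolves; otherwise the "No valid driver found" sentinel is emitted.
# Taking the probe results as arguments keeps the sketch deterministic.
pick_driver() {
    local group_count=$1 module_resolves=$2
    if (( group_count > 0 )) && [[ $module_resolves == yes ]]; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}

# The log above found 214 IOMMU groups and a resolvable vfio_pci module:
pick_driver 214 yes   # -> vfio-pci
# Without IOMMU groups (e.g. IOMMU disabled in firmware):
pick_driver 0 yes     # -> No valid driver found
```

On the real host the two inputs come from `iommu_groups=(/sys/kernel/iommu_groups/*)` plus `(( ${#iommu_groups[@]} > 0 ))`, and from checking that `modprobe --show-depends vfio_pci` output contains `.ko`, exactly as the xtrace lines show.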
00:05:23.807 11:46:36 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:29.081 00:05:29.081 real 0m14.026s 00:05:29.081 user 0m2.875s 00:05:29.081 sys 0m5.622s 00:05:29.081 11:46:41 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:29.081 11:46:41 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:29.081 ************************************ 00:05:29.081 END TEST guess_driver 00:05:29.081 ************************************ 00:05:29.081 11:46:41 setup.sh.driver -- common/autotest_common.sh@1142 -- # return 0 00:05:29.081 00:05:29.081 real 0m19.385s 00:05:29.081 user 0m4.382s 00:05:29.081 sys 0m8.702s 00:05:29.081 11:46:41 setup.sh.driver -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:29.081 11:46:41 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:29.081 ************************************ 00:05:29.081 END TEST driver 00:05:29.081 ************************************ 00:05:29.081 11:46:41 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:29.081 11:46:41 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:29.081 11:46:41 setup.sh -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:05:29.081 11:46:41 setup.sh -- common/autotest_common.sh@1105 -- # xtrace_disable 00:05:29.081 11:46:41 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:29.081 ************************************ 00:05:29.081 START TEST devices 00:05:29.081 ************************************ 00:05:29.081 11:46:41 setup.sh.devices -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:29.081 * Looking for test storage... 
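The bulk of the earlier hugepages trace is setup/common.sh's `get_meminfo` walking /sys/devices/system/node/node0/meminfo one field at a time: every line is stripped of its `Node N ` prefix, split on `': '`, and compared against the requested key (here `HugePages_Surp`) until it matches. A self-contained sketch of the same parsing, fed from a here-string instead of sysfs so it runs without a NUMA node (the function name matches the log; the simplified body is illustrative):

```shell
#!/usr/bin/env bash
shopt -s extglob   # needed for the +([0-9]) prefix-strip pattern

# Sketch of setup/common.sh's get_meminfo loop traced at length above:
# drop the "Node N " prefix, split "key: value [unit]" on ': ', and print
# the value for the requested key.
get_meminfo() {
    local get=$1 line var val _
    while read -r line; do
        line=${line#Node +([0-9]) }            # "Node 0 MemFree: ..." -> "MemFree: ..."
        IFS=': ' read -r var val _ <<< "$line" # var=key, val=number, _=unit
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}

# Sample of the node0 meminfo shown in the log:
node0='Node 0 MemFree: 41967588 kB
Node 0 HugePages_Total: 1024
Node 0 HugePages_Free: 1024
Node 0 HugePages_Surp: 0'

get_meminfo HugePages_Free <<< "$node0"   # -> 1024
```

The real helper points the loop at /proc/meminfo when no node is given, which is why the trace first sets `mem_f=/proc/meminfo` and only switches to the per-node file after the `-e /sys/devices/system/node/node0/meminfo` test succeeds.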
00:05:29.081 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:29.081 11:46:41 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:29.081 11:46:41 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:29.081 11:46:41 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:29.081 11:46:41 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:32.373 11:46:45 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
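The devices test opens with `get_zoned_devs`, whose per-device check (`is_block_zoned nvme0n1` above) reads /sys/block/&lt;dev&gt;/queue/zoned and treats anything other than `none` as a zoned device — the `[[ none != none ]]` line in the trace is that comparison coming back false for a conventional NVMe drive. A sketch of the same filter, using a throwaway directory in place of sysfs so it runs on any machine (the function name matches the log; the body is a simplified illustration):

```shell
#!/usr/bin/env bash
# Sketch of the zoned-device filter traced above: a block device counts as
# zoned when its queue/zoned attribute exists and is not "none".
# The sysfs root is a parameter here so a temp dir can stand in for /sys/block.
is_block_zoned() {
    local sysfs=$1 device=$2
    [[ -e $sysfs/$device/queue/zoned ]] || return 1
    [[ $(<"$sysfs/$device/queue/zoned") != none ]]
}

fake=$(mktemp -d)
mkdir -p "$fake/nvme0n1/queue" "$fake/nvme1n1/queue"
echo none > "$fake/nvme0n1/queue/zoned"          # conventional device, as in the log
echo host-managed > "$fake/nvme1n1/queue/zoned"  # a zoned (ZNS-style) device
is_block_zoned "$fake" nvme0n1 && echo "nvme0n1 zoned" || echo "nvme0n1 not zoned"
is_block_zoned "$fake" nvme1n1 && echo "nvme1n1 zoned" || echo "nvme1n1 not zoned"
rm -rf "$fake"
```

Zoned devices are collected into the `zoned_devs` map and excluded from the mount tests that follow, since they cannot take an ordinary filesystem without zone-aware handling.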
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\5\e\:\0\0\.\0* ]]
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1
00:05:32.374 11:46:45 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt
00:05:32.374 11:46:45 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1
00:05:32.374 No valid GPT data, bailing
00:05:32.374 11:46:45 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:05:32.374 11:46:45 setup.sh.devices -- scripts/common.sh@391 -- # pt=
00:05:32.374 11:46:45 setup.sh.devices -- scripts/common.sh@392 -- # return 1
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1
00:05:32.374 11:46:45 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1
00:05:32.374 11:46:45 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]]
00:05:32.374 11:46:45 setup.sh.devices -- setup/common.sh@80 -- # echo 8001563222016
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@204 -- # (( 8001563222016 >= min_disk_size ))
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}")
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 ))
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1
00:05:32.374 11:46:45 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount
00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:32.374 11:46:45 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:05:32.374 ************************************
00:05:32.374 START TEST nvme_mount
00:05:32.374 ************************************
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1123 -- # nvme_mount
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=()
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ ))
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 ))
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:05:32.374 11:46:45 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1
00:05:33.753 Creating new GPT entries in memory.
00:05:33.753 GPT data structures destroyed! You may now partition the disk using fdisk or
00:05:33.753 other utilities.
00:05:33.753 11:46:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:05:33.753 11:46:46 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:33.753 11:46:46 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:05:33.753 11:46:46 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:05:33.753 11:46:46 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:05:34.691 Creating new GPT entries in memory.
00:05:34.691 The operation has completed successfully.
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ ))
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 1390693
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]]
00:05:34.691 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # :
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:05:34.692 11:46:48 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1
00:05:38.886 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:05:38.886 11:46:51 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:05:38.886 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54
00:05:38.886 /dev/nvme0n1: 8 bytes were erased at offset 0x74702555e00 (gpt): 45 46 49 20 50 41 52 54
00:05:38.886 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa
00:05:38.886 /dev/nvme0n1: calling ioctl to re-read partition table: Success
00:05:38.886 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]]
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # :
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:05:38.887 11:46:52 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:05:42.178 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.178 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme0n1 '' ''
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]]
00:05:42.179 11:46:55 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 ))
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]]
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1
00:05:46.373 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef
00:05:46.373
00:05:46.373 real 0m13.274s
00:05:46.373 user 0m3.684s
00:05:46.373 sys 0m7.495s
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # xtrace_disable
00:05:46.373 11:46:59 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x
00:05:46.373 ************************************
00:05:46.373 END TEST nvme_mount
00:05:46.373 ************************************
00:05:46.373 11:46:59 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0
00:05:46.373 11:46:59 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount
00:05:46.373 11:46:59 setup.sh.devices -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:05:46.373 11:46:59 setup.sh.devices -- common/autotest_common.sh@1105 -- # xtrace_disable
00:05:46.373 11:46:59 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x
00:05:46.373 ************************************
00:05:46.373 START TEST dm_mount
00:05:46.373 ************************************
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- common/autotest_common.sh@1123 -- # dm_mount
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=()
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part")
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 ))
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all
00:05:46.373 11:46:59 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2
00:05:46.940 Creating new GPT entries in memory.
00:05:46.940 GPT data structures destroyed! You may now partition the disk using fdisk or
00:05:46.940 other utilities.
00:05:46.940 11:47:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 ))
00:05:46.940 11:47:00 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:46.940 11:47:00 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:05:46.940 11:47:00 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:05:46.940 11:47:00 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199
00:05:47.875 Creating new GPT entries in memory.
00:05:47.875 The operation has completed successfully.
00:05:47.875 11:47:01 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:05:47.875 11:47:01 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:47.875 11:47:01 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
00:05:47.875 11:47:01 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 ))
00:05:47.875 11:47:01 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351
00:05:48.808 The operation has completed successfully.
00:05:48.808 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ ))
00:05:48.808 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no ))
00:05:48.808 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 1394936
00:05:49.067 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5}
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]]
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]]
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size=
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]]
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0
00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:49.068 11:47:02 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:53.262 11:47:06 
setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:53.262 11:47:06 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 
-- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:05:56.555 11:47:09 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:56.555 
11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:05:56.555 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:05:56.555 00:05:56.555 real 0m10.776s 00:05:56.555 user 0m2.726s 00:05:56.555 sys 0m5.172s 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:56.555 11:47:10 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:05:56.555 ************************************ 00:05:56.555 END TEST dm_mount 00:05:56.555 ************************************ 00:05:56.555 11:47:10 setup.sh.devices -- common/autotest_common.sh@1142 -- # return 0 00:05:56.555 11:47:10 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:05:56.555 11:47:10 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:05:56.555 11:47:10 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:56.814 11:47:10 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:56.814 11:47:10 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:56.814 11:47:10 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:56.814 11:47:10 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:57.074 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:57.074 /dev/nvme0n1: 8 bytes were erased at offset 0x74702555e00 (gpt): 45 46 49 20 50 41 52 54 00:05:57.074 /dev/nvme0n1: 2 
bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:57.074 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:05:57.074 11:47:10 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:05:57.074 00:05:57.074 real 0m28.636s 00:05:57.074 user 0m7.924s 00:05:57.074 sys 0m15.645s 00:05:57.074 11:47:10 setup.sh.devices -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:57.074 11:47:10 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:57.074 ************************************ 00:05:57.074 END TEST devices 00:05:57.074 ************************************ 00:05:57.074 11:47:10 setup.sh -- common/autotest_common.sh@1142 -- # return 0 00:05:57.074 00:05:57.074 real 1m52.640s 00:05:57.074 user 0m32.294s 00:05:57.074 sys 0m59.757s 00:05:57.074 11:47:10 setup.sh -- common/autotest_common.sh@1124 -- # xtrace_disable 00:05:57.074 11:47:10 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:57.074 ************************************ 00:05:57.074 END TEST setup.sh 00:05:57.074 ************************************ 00:05:57.074 11:47:10 -- common/autotest_common.sh@1142 -- # return 0 00:05:57.074 11:47:10 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:00.368 Hugepages 00:06:00.368 node hugesize free / total 00:06:00.368 node0 1048576kB 0 / 0 00:06:00.628 node0 
2048kB 1024 / 1024 00:06:00.628 node1 1048576kB 0 / 0 00:06:00.628 node1 2048kB 1024 / 1024 00:06:00.628 00:06:00.628 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:00.628 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - - 00:06:00.628 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - - 00:06:00.628 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:06:00.628 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - - 00:06:00.628 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - - 00:06:00.628 11:47:14 -- spdk/autotest.sh@130 -- # uname -s 00:06:00.628 11:47:14 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:06:00.628 11:47:14 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:06:00.628 11:47:14 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:04.818 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 
00:06:04.818 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:04.818 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:10.202 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:06:10.202 11:47:23 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:10.772 11:47:24 -- common/autotest_common.sh@1533 -- # bdfs=() 00:06:10.772 11:47:24 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:10.772 11:47:24 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:10.772 11:47:24 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:10.772 11:47:24 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:10.772 11:47:24 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:10.772 11:47:24 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:10.772 11:47:24 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:10.772 11:47:24 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:10.772 11:47:24 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:10.772 11:47:24 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:10.772 11:47:24 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:14.966 Waiting for block devices as requested 00:06:14.966 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:06:14.966 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:14.966 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:14.966 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:14.967 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:14.967 
0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:14.967 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:14.967 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:14.967 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:15.226 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:06:15.226 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:06:15.226 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:06:15.485 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:06:15.485 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:06:15.485 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:06:15.746 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:06:15.746 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:06:15.746 11:47:29 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:15.746 11:47:29 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:06:15.746 11:47:29 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:15.746 11:47:29 -- common/autotest_common.sh@1502 -- # grep 0000:5e:00.0/nvme/nvme 00:06:15.746 11:47:29 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:15.746 11:47:29 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 ]] 00:06:15.747 11:47:29 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme0 00:06:15.747 11:47:29 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:06:15.747 11:47:29 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:15.747 11:47:29 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:15.747 11:47:29 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:15.747 11:47:29 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:15.747 11:47:29 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:15.747 11:47:29 -- common/autotest_common.sh@1545 -- # 
oacs=' 0xe' 00:06:15.747 11:47:29 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:15.747 11:47:29 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:15.747 11:47:29 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:15.747 11:47:29 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:15.747 11:47:29 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:06:15.747 11:47:29 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:15.747 11:47:29 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:15.747 11:47:29 -- common/autotest_common.sh@1557 -- # continue 00:06:15.747 11:47:29 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:15.747 11:47:29 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:15.747 11:47:29 -- common/autotest_common.sh@10 -- # set +x 00:06:16.006 11:47:29 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:06:16.006 11:47:29 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:16.006 11:47:29 -- common/autotest_common.sh@10 -- # set +x 00:06:16.006 11:47:29 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:20.200 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 
00:06:20.200 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:06:20.200 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:06:25.496 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:06:25.496 11:47:38 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:25.496 11:47:38 -- common/autotest_common.sh@728 -- # xtrace_disable 00:06:25.496 11:47:38 -- common/autotest_common.sh@10 -- # set +x 00:06:25.496 11:47:38 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:25.496 11:47:38 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:25.496 11:47:38 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 00:06:25.496 11:47:38 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:25.496 11:47:38 -- common/autotest_common.sh@1577 -- # local bdfs 00:06:25.496 11:47:38 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:25.496 11:47:38 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:25.496 11:47:38 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:25.496 11:47:38 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:25.496 11:47:38 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:25.496 11:47:38 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:25.496 11:47:38 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:25.496 11:47:38 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:5e:00.0 00:06:25.496 11:47:38 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:25.496 11:47:38 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device 00:06:25.496 11:47:38 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:06:25.496 11:47:38 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:25.496 11:47:38 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:06:25.496 11:47:38 -- common/autotest_common.sh@1586 -- # 
printf '%s\n' 0000:5e:00.0 00:06:25.496 11:47:38 -- common/autotest_common.sh@1592 -- # [[ -z 0000:5e:00.0 ]] 00:06:25.496 11:47:38 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=1404114 00:06:25.496 11:47:38 -- common/autotest_common.sh@1598 -- # waitforlisten 1404114 00:06:25.496 11:47:38 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.496 11:47:38 -- common/autotest_common.sh@829 -- # '[' -z 1404114 ']' 00:06:25.496 11:47:38 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.496 11:47:38 -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:25.496 11:47:38 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.496 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.496 11:47:38 -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:25.496 11:47:38 -- common/autotest_common.sh@10 -- # set +x 00:06:25.496 [2024-07-15 11:47:38.369825] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:06:25.496 [2024-07-15 11:47:38.369901] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1404114 ] 00:06:25.496 [2024-07-15 11:47:38.493031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.496 [2024-07-15 11:47:38.594737] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:25.754 11:47:39 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:25.754 11:47:39 -- common/autotest_common.sh@862 -- # return 0 00:06:25.754 11:47:39 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:06:25.754 11:47:39 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:06:25.754 11:47:39 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0 00:06:29.041 nvme0n1 00:06:29.041 11:47:42 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:29.041 [2024-07-15 11:47:42.611546] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:29.041 request: 00:06:29.041 { 00:06:29.041 "nvme_ctrlr_name": "nvme0", 00:06:29.041 "password": "test", 00:06:29.041 "method": "bdev_nvme_opal_revert", 00:06:29.041 "req_id": 1 00:06:29.041 } 00:06:29.041 Got JSON-RPC error response 00:06:29.041 response: 00:06:29.041 { 00:06:29.041 "code": -32602, 00:06:29.041 "message": "Invalid parameters" 00:06:29.041 } 00:06:29.041 11:47:42 -- common/autotest_common.sh@1604 -- # true 00:06:29.041 11:47:42 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:06:29.041 11:47:42 -- common/autotest_common.sh@1608 -- # killprocess 1404114 00:06:29.041 11:47:42 -- common/autotest_common.sh@948 -- # '[' -z 1404114 ']' 00:06:29.041 11:47:42 -- 
common/autotest_common.sh@952 -- # kill -0 1404114 00:06:29.299 11:47:42 -- common/autotest_common.sh@953 -- # uname 00:06:29.300 11:47:42 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:29.300 11:47:42 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1404114 00:06:29.300 11:47:42 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:29.300 11:47:42 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:29.300 11:47:42 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1404114' 00:06:29.300 killing process with pid 1404114 00:06:29.300 11:47:42 -- common/autotest_common.sh@967 -- # kill 1404114 00:06:29.300 11:47:42 -- common/autotest_common.sh@972 -- # wait 1404114 00:06:37.456 11:47:49 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:37.456 11:47:49 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:37.456 11:47:49 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:37.456 11:47:49 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:37.456 11:47:49 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:37.456 Restarting all devices. 00:06:40.747 lstat() error: No such file or directory 00:06:40.747 QAT Error: No GENERAL section found 00:06:40.747 Failed to configure qat_dev0 00:06:40.747 lstat() error: No such file or directory 00:06:40.747 QAT Error: No GENERAL section found 00:06:40.747 Failed to configure qat_dev1 00:06:40.747 lstat() error: No such file or directory 00:06:40.747 QAT Error: No GENERAL section found 00:06:40.747 Failed to configure qat_dev2 00:06:40.747 enable sriov 00:06:40.747 Checking status of all devices. 
00:06:40.747 There is 3 QAT acceleration device(s) in the system: 00:06:40.747 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:3d:00.0, #accel: 5 #engines: 10 state: down 00:06:40.747 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:3f:00.0, #accel: 5 #engines: 10 state: down 00:06:40.747 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:da:00.0, #accel: 5 #engines: 10 state: down 00:06:41.317 0000:3d:00.0 set to 16 VFs 00:06:42.256 0000:3f:00.0 set to 16 VFs 00:06:42.826 0000:da:00.0 set to 16 VFs 00:06:44.206 Properly configured the qat device with driver uio_pci_generic. 00:06:44.206 11:47:57 -- spdk/autotest.sh@162 -- # timing_enter lib 00:06:44.206 11:47:57 -- common/autotest_common.sh@722 -- # xtrace_disable 00:06:44.206 11:47:57 -- common/autotest_common.sh@10 -- # set +x 00:06:44.206 11:47:57 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]] 00:06:44.206 11:47:57 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:44.206 11:47:57 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:44.206 11:47:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.206 11:47:57 -- common/autotest_common.sh@10 -- # set +x 00:06:44.206 ************************************ 00:06:44.206 START TEST env 00:06:44.206 ************************************ 00:06:44.206 11:47:57 env -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh 00:06:44.465 * Looking for test storage... 
00:06:44.465 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env 00:06:44.465 11:47:57 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:44.465 11:47:57 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:44.465 11:47:57 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.465 11:47:57 env -- common/autotest_common.sh@10 -- # set +x 00:06:44.465 ************************************ 00:06:44.465 START TEST env_memory 00:06:44.465 ************************************ 00:06:44.465 11:47:57 env.env_memory -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut 00:06:44.465 00:06:44.465 00:06:44.465 CUnit - A unit testing framework for C - Version 2.1-3 00:06:44.465 http://cunit.sourceforge.net/ 00:06:44.465 00:06:44.465 00:06:44.465 Suite: memory 00:06:44.465 Test: alloc and free memory map ...[2024-07-15 11:47:57.985847] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed 00:06:44.465 passed 00:06:44.465 Test: mem map translation ...[2024-07-15 11:47:58.015274] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234 00:06:44.465 [2024-07-15 11:47:58.015298] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152 00:06:44.465 [2024-07-15 11:47:58.015354] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656 00:06:44.465 [2024-07-15 11:47:58.015367] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 
600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map 00:06:44.725 passed 00:06:44.725 Test: mem map registration ...[2024-07-15 11:47:58.073279] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234 00:06:44.725 [2024-07-15 11:47:58.073304] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152 00:06:44.725 passed 00:06:44.725 Test: mem map adjacent registrations ...passed 00:06:44.725 00:06:44.725 Run Summary: Type Total Ran Passed Failed Inactive 00:06:44.726 suites 1 1 n/a 0 0 00:06:44.726 tests 4 4 4 0 0 00:06:44.726 asserts 152 152 152 0 n/a 00:06:44.726 00:06:44.726 Elapsed time = 0.200 seconds 00:06:44.726 00:06:44.726 real 0m0.214s 00:06:44.726 user 0m0.200s 00:06:44.726 sys 0m0.013s 00:06:44.726 11:47:58 env.env_memory -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:44.726 11:47:58 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:44.726 ************************************ 00:06:44.726 END TEST env_memory 00:06:44.726 ************************************ 00:06:44.726 11:47:58 env -- common/autotest_common.sh@1142 -- # return 0 00:06:44.726 11:47:58 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:44.726 11:47:58 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:44.726 11:47:58 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:44.726 11:47:58 env -- common/autotest_common.sh@10 -- # set +x 00:06:44.726 ************************************ 00:06:44.726 START TEST env_vtophys 00:06:44.726 ************************************ 00:06:44.726 11:47:58 env.env_vtophys -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:44.726 
EAL: lib.eal log level changed from notice to debug 00:06:44.726 EAL: Detected lcore 0 as core 0 on socket 0 00:06:44.726 EAL: Detected lcore 1 as core 1 on socket 0 00:06:44.726 EAL: Detected lcore 2 as core 2 on socket 0 00:06:44.726 EAL: Detected lcore 3 as core 3 on socket 0 00:06:44.726 EAL: Detected lcore 4 as core 4 on socket 0 00:06:44.726 EAL: Detected lcore 5 as core 8 on socket 0 00:06:44.726 EAL: Detected lcore 6 as core 9 on socket 0 00:06:44.726 EAL: Detected lcore 7 as core 10 on socket 0 00:06:44.726 EAL: Detected lcore 8 as core 11 on socket 0 00:06:44.726 EAL: Detected lcore 9 as core 16 on socket 0 00:06:44.726 EAL: Detected lcore 10 as core 17 on socket 0 00:06:44.726 EAL: Detected lcore 11 as core 18 on socket 0 00:06:44.726 EAL: Detected lcore 12 as core 19 on socket 0 00:06:44.726 EAL: Detected lcore 13 as core 20 on socket 0 00:06:44.726 EAL: Detected lcore 14 as core 24 on socket 0 00:06:44.726 EAL: Detected lcore 15 as core 25 on socket 0 00:06:44.726 EAL: Detected lcore 16 as core 26 on socket 0 00:06:44.726 EAL: Detected lcore 17 as core 27 on socket 0 00:06:44.726 EAL: Detected lcore 18 as core 0 on socket 1 00:06:44.726 EAL: Detected lcore 19 as core 1 on socket 1 00:06:44.726 EAL: Detected lcore 20 as core 2 on socket 1 00:06:44.726 EAL: Detected lcore 21 as core 3 on socket 1 00:06:44.726 EAL: Detected lcore 22 as core 4 on socket 1 00:06:44.726 EAL: Detected lcore 23 as core 8 on socket 1 00:06:44.726 EAL: Detected lcore 24 as core 9 on socket 1 00:06:44.726 EAL: Detected lcore 25 as core 10 on socket 1 00:06:44.726 EAL: Detected lcore 26 as core 11 on socket 1 00:06:44.726 EAL: Detected lcore 27 as core 16 on socket 1 00:06:44.726 EAL: Detected lcore 28 as core 17 on socket 1 00:06:44.726 EAL: Detected lcore 29 as core 18 on socket 1 00:06:44.726 EAL: Detected lcore 30 as core 19 on socket 1 00:06:44.726 EAL: Detected lcore 31 as core 20 on socket 1 00:06:44.726 EAL: Detected lcore 32 as core 24 on socket 1 00:06:44.726 EAL: 
Detected lcore 33 as core 25 on socket 1 00:06:44.726 EAL: Detected lcore 34 as core 26 on socket 1 00:06:44.726 EAL: Detected lcore 35 as core 27 on socket 1 00:06:44.726 EAL: Detected lcore 36 as core 0 on socket 0 00:06:44.726 EAL: Detected lcore 37 as core 1 on socket 0 00:06:44.726 EAL: Detected lcore 38 as core 2 on socket 0 00:06:44.726 EAL: Detected lcore 39 as core 3 on socket 0 00:06:44.726 EAL: Detected lcore 40 as core 4 on socket 0 00:06:44.726 EAL: Detected lcore 41 as core 8 on socket 0 00:06:44.726 EAL: Detected lcore 42 as core 9 on socket 0 00:06:44.726 EAL: Detected lcore 43 as core 10 on socket 0 00:06:44.726 EAL: Detected lcore 44 as core 11 on socket 0 00:06:44.726 EAL: Detected lcore 45 as core 16 on socket 0 00:06:44.726 EAL: Detected lcore 46 as core 17 on socket 0 00:06:44.726 EAL: Detected lcore 47 as core 18 on socket 0 00:06:44.726 EAL: Detected lcore 48 as core 19 on socket 0 00:06:44.726 EAL: Detected lcore 49 as core 20 on socket 0 00:06:44.726 EAL: Detected lcore 50 as core 24 on socket 0 00:06:44.726 EAL: Detected lcore 51 as core 25 on socket 0 00:06:44.726 EAL: Detected lcore 52 as core 26 on socket 0 00:06:44.726 EAL: Detected lcore 53 as core 27 on socket 0 00:06:44.726 EAL: Detected lcore 54 as core 0 on socket 1 00:06:44.726 EAL: Detected lcore 55 as core 1 on socket 1 00:06:44.726 EAL: Detected lcore 56 as core 2 on socket 1 00:06:44.726 EAL: Detected lcore 57 as core 3 on socket 1 00:06:44.726 EAL: Detected lcore 58 as core 4 on socket 1 00:06:44.726 EAL: Detected lcore 59 as core 8 on socket 1 00:06:44.726 EAL: Detected lcore 60 as core 9 on socket 1 00:06:44.726 EAL: Detected lcore 61 as core 10 on socket 1 00:06:44.726 EAL: Detected lcore 62 as core 11 on socket 1 00:06:44.726 EAL: Detected lcore 63 as core 16 on socket 1 00:06:44.726 EAL: Detected lcore 64 as core 17 on socket 1 00:06:44.726 EAL: Detected lcore 65 as core 18 on socket 1 00:06:44.726 EAL: Detected lcore 66 as core 19 on socket 1 00:06:44.726 EAL: 
Detected lcore 67 as core 20 on socket 1 00:06:44.726 EAL: Detected lcore 68 as core 24 on socket 1 00:06:44.726 EAL: Detected lcore 69 as core 25 on socket 1 00:06:44.726 EAL: Detected lcore 70 as core 26 on socket 1 00:06:44.726 EAL: Detected lcore 71 as core 27 on socket 1 00:06:44.726 EAL: Maximum logical cores by configuration: 128 00:06:44.726 EAL: Detected CPU lcores: 72 00:06:44.726 EAL: Detected NUMA nodes: 2 00:06:44.726 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:44.726 EAL: Detected shared linkage of DPDK 00:06:44.726 EAL: No shared files mode enabled, IPC will be disabled 00:06:44.726 EAL: No shared files mode enabled, IPC is disabled 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.3 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:01.7 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.3 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3d:02.7 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for 
device 0000:3f:01.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.3 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:01.7 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.3 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:3f:02.7 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.3 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:01.7 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.0 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.1 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.2 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.3 
wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.4 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.5 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.6 wants IOVA as 'PA' 00:06:44.726 EAL: PCI driver qat for device 0000:da:02.7 wants IOVA as 'PA' 00:06:44.726 EAL: Bus pci wants IOVA as 'PA' 00:06:44.726 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:44.726 EAL: Bus vdev wants IOVA as 'DC' 00:06:44.726 EAL: Selected IOVA mode 'PA' 00:06:44.726 EAL: Probing VFIO support... 00:06:44.726 EAL: IOMMU type 1 (Type 1) is supported 00:06:44.726 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:44.726 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:44.726 EAL: VFIO support initialized 00:06:44.726 EAL: Ask a virtual area of 0x2e000 bytes 00:06:44.726 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:44.726 EAL: Setting up physically contiguous memory... 00:06:44.726 EAL: Setting maximum number of open files to 524288 00:06:44.726 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:44.726 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:44.726 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:44.726 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.726 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:44.726 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:44.726 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.726 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:44.726 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:44.726 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.726 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:44.726 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:44.726 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.726 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:44.726 
EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:44.726 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.726 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:44.726 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:44.726 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.726 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:44.726 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:44.726 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.726 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:44.727 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:44.727 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.727 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:44.727 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:44.727 EAL: Creating 4 segment lists: n_segs:8192 socket_id:1 hugepage_sz:2097152 00:06:44.727 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.727 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:44.727 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:44.727 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.727 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:44.727 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:44.727 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.727 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:44.727 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:44.727 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.727 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:44.727 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:44.727 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.727 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:44.727 EAL: Memseg list allocated at socket 1, page size 0x800kB 
00:06:44.727 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.727 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:44.727 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:44.727 EAL: Ask a virtual area of 0x61000 bytes 00:06:44.727 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:44.727 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:44.727 EAL: Ask a virtual area of 0x400000000 bytes 00:06:44.727 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:44.727 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:44.727 EAL: Hugepages will be freed exactly as allocated. 00:06:44.727 EAL: No shared files mode enabled, IPC is disabled 00:06:44.727 EAL: No shared files mode enabled, IPC is disabled 00:06:44.727 EAL: TSC frequency is ~2300000 KHz 00:06:44.727 EAL: Main lcore 0 is ready (tid=7f2adc738b00;cpuset=[0]) 00:06:44.727 EAL: Trying to obtain current memory policy. 00:06:44.727 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:44.987 EAL: Restoring previous memory policy: 0 00:06:44.987 EAL: request: mp_malloc_sync 00:06:44.987 EAL: No shared files mode enabled, IPC is disabled 00:06:44.987 EAL: Heap on socket 0 was expanded by 2MB 00:06:44.987 EAL: PCI device 0000:3d:01.0 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001000000 00:06:44.987 EAL: PCI memory mapped at 0x202001001000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.1 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001002000 00:06:44.987 EAL: PCI memory mapped at 0x202001003000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.2 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory 
mapped at 0x202001004000 00:06:44.987 EAL: PCI memory mapped at 0x202001005000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.3 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001006000 00:06:44.987 EAL: PCI memory mapped at 0x202001007000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.4 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001008000 00:06:44.987 EAL: PCI memory mapped at 0x202001009000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.5 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200100a000 00:06:44.987 EAL: PCI memory mapped at 0x20200100b000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.6 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200100c000 00:06:44.987 EAL: PCI memory mapped at 0x20200100d000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:01.7 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200100e000 00:06:44.987 EAL: PCI memory mapped at 0x20200100f000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.0 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001010000 00:06:44.987 EAL: PCI memory mapped at 0x202001011000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.1 on NUMA 
socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001012000 00:06:44.987 EAL: PCI memory mapped at 0x202001013000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.2 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001014000 00:06:44.987 EAL: PCI memory mapped at 0x202001015000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.3 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001016000 00:06:44.987 EAL: PCI memory mapped at 0x202001017000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.4 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001018000 00:06:44.987 EAL: PCI memory mapped at 0x202001019000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.5 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200101a000 00:06:44.987 EAL: PCI memory mapped at 0x20200101b000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.6 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200101c000 00:06:44.987 EAL: PCI memory mapped at 0x20200101d000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:44.987 EAL: PCI device 0000:3d:02.7 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x20200101e000 00:06:44.987 EAL: PCI memory mapped at 0x20200101f000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) 
device: 0000:3d:02.7 (socket 0) 00:06:44.987 EAL: PCI device 0000:3f:01.0 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001020000 00:06:44.987 EAL: PCI memory mapped at 0x202001021000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:44.987 EAL: PCI device 0000:3f:01.1 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001022000 00:06:44.987 EAL: PCI memory mapped at 0x202001023000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:44.987 EAL: PCI device 0000:3f:01.2 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001024000 00:06:44.987 EAL: PCI memory mapped at 0x202001025000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:44.987 EAL: PCI device 0000:3f:01.3 on NUMA socket 0 00:06:44.987 EAL: probe driver: 8086:37c9 qat 00:06:44.987 EAL: PCI memory mapped at 0x202001026000 00:06:44.987 EAL: PCI memory mapped at 0x202001027000 00:06:44.987 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:01.4 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001028000 00:06:44.988 EAL: PCI memory mapped at 0x202001029000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:01.5 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x20200102a000 00:06:44.988 EAL: PCI memory mapped at 0x20200102b000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:01.6 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x20200102c000 00:06:44.988 EAL: PCI 
memory mapped at 0x20200102d000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:01.7 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x20200102e000 00:06:44.988 EAL: PCI memory mapped at 0x20200102f000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.0 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001030000 00:06:44.988 EAL: PCI memory mapped at 0x202001031000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.1 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001032000 00:06:44.988 EAL: PCI memory mapped at 0x202001033000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.2 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001034000 00:06:44.988 EAL: PCI memory mapped at 0x202001035000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.3 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001036000 00:06:44.988 EAL: PCI memory mapped at 0x202001037000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.4 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001038000 00:06:44.988 EAL: PCI memory mapped at 0x202001039000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.5 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 
qat 00:06:44.988 EAL: PCI memory mapped at 0x20200103a000 00:06:44.988 EAL: PCI memory mapped at 0x20200103b000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.6 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x20200103c000 00:06:44.988 EAL: PCI memory mapped at 0x20200103d000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:44.988 EAL: PCI device 0000:3f:02.7 on NUMA socket 0 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x20200103e000 00:06:44.988 EAL: PCI memory mapped at 0x20200103f000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:44.988 EAL: PCI device 0000:da:01.0 on NUMA socket 1 00:06:44.988 EAL: probe driver: 8086:37c9 qat 00:06:44.988 EAL: PCI memory mapped at 0x202001040000 00:06:44.988 EAL: PCI memory mapped at 0x202001041000 00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:44.988 EAL: Trying to obtain current memory policy. 
00:06:44.988 EAL: Setting policy MPOL_PREFERRED for socket 1
00:06:44.988 EAL: Restoring previous memory policy: 4
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 1 was expanded by 2MB
00:06:44.988 EAL: PCI device 0000:da:01.1 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001042000
00:06:44.988 EAL: PCI memory mapped at 0x202001043000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.2 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001044000
00:06:44.988 EAL: PCI memory mapped at 0x202001045000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.3 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001046000
00:06:44.988 EAL: PCI memory mapped at 0x202001047000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.4 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001048000
00:06:44.988 EAL: PCI memory mapped at 0x202001049000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.5 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200104a000
00:06:44.988 EAL: PCI memory mapped at 0x20200104b000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.6 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200104c000
00:06:44.988 EAL: PCI memory mapped at 0x20200104d000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:01.7 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200104e000
00:06:44.988 EAL: PCI memory mapped at 0x20200104f000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.0 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001050000
00:06:44.988 EAL: PCI memory mapped at 0x202001051000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.1 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001052000
00:06:44.988 EAL: PCI memory mapped at 0x202001053000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.2 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001054000
00:06:44.988 EAL: PCI memory mapped at 0x202001055000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.3 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001056000
00:06:44.988 EAL: PCI memory mapped at 0x202001057000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.4 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x202001058000
00:06:44.988 EAL: PCI memory mapped at 0x202001059000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.5 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200105a000
00:06:44.988 EAL: PCI memory mapped at 0x20200105b000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.6 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200105c000
00:06:44.988 EAL: PCI memory mapped at 0x20200105d000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:44.988 EAL: PCI device 0000:da:02.7 on NUMA socket 1
00:06:44.988 EAL: probe driver: 8086:37c9 qat
00:06:44.988 EAL: PCI memory mapped at 0x20200105e000
00:06:44.988 EAL: PCI memory mapped at 0x20200105f000
00:06:44.988 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:44.988 EAL: Mem event callback 'spdk:(nil)' registered
00:06:44.988 
00:06:44.988 
00:06:44.988 CUnit - A unit testing framework for C - Version 2.1-3
00:06:44.988 http://cunit.sourceforge.net/
00:06:44.988 
00:06:44.988 
00:06:44.988 Suite: components_suite
00:06:44.988 Test: vtophys_malloc_test ...passed
00:06:44.988 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:44.988 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.988 EAL: Restoring previous memory policy: 4
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was expanded by 4MB
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was shrunk by 4MB
00:06:44.988 EAL: Trying to obtain current memory policy.
00:06:44.988 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.988 EAL: Restoring previous memory policy: 4
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was expanded by 6MB
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was shrunk by 6MB
00:06:44.988 EAL: Trying to obtain current memory policy.
00:06:44.988 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.988 EAL: Restoring previous memory policy: 4
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was expanded by 10MB
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was shrunk by 10MB
00:06:44.988 EAL: Trying to obtain current memory policy.
00:06:44.988 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.988 EAL: Restoring previous memory policy: 4
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.988 EAL: Heap on socket 0 was expanded by 18MB
00:06:44.988 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.988 EAL: request: mp_malloc_sync
00:06:44.988 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was shrunk by 18MB
00:06:44.989 EAL: Trying to obtain current memory policy.
00:06:44.989 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.989 EAL: Restoring previous memory policy: 4
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was expanded by 34MB
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was shrunk by 34MB
00:06:44.989 EAL: Trying to obtain current memory policy.
00:06:44.989 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.989 EAL: Restoring previous memory policy: 4
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was expanded by 66MB
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was shrunk by 66MB
00:06:44.989 EAL: Trying to obtain current memory policy.
00:06:44.989 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:44.989 EAL: Restoring previous memory policy: 4
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was expanded by 130MB
00:06:44.989 EAL: Calling mem event callback 'spdk:(nil)'
00:06:44.989 EAL: request: mp_malloc_sync
00:06:44.989 EAL: No shared files mode enabled, IPC is disabled
00:06:44.989 EAL: Heap on socket 0 was shrunk by 130MB
00:06:44.989 EAL: Trying to obtain current memory policy.
00:06:44.989 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:45.248 EAL: Restoring previous memory policy: 4
00:06:45.248 EAL: Calling mem event callback 'spdk:(nil)'
00:06:45.248 EAL: request: mp_malloc_sync
00:06:45.248 EAL: No shared files mode enabled, IPC is disabled
00:06:45.248 EAL: Heap on socket 0 was expanded by 258MB
00:06:45.248 EAL: Calling mem event callback 'spdk:(nil)'
00:06:45.248 EAL: request: mp_malloc_sync
00:06:45.248 EAL: No shared files mode enabled, IPC is disabled
00:06:45.248 EAL: Heap on socket 0 was shrunk by 258MB
00:06:45.248 EAL: Trying to obtain current memory policy.
00:06:45.248 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:45.248 EAL: Restoring previous memory policy: 4
00:06:45.248 EAL: Calling mem event callback 'spdk:(nil)'
00:06:45.248 EAL: request: mp_malloc_sync
00:06:45.248 EAL: No shared files mode enabled, IPC is disabled
00:06:45.248 EAL: Heap on socket 0 was expanded by 514MB
00:06:45.507 EAL: Calling mem event callback 'spdk:(nil)'
00:06:45.508 EAL: request: mp_malloc_sync
00:06:45.508 EAL: No shared files mode enabled, IPC is disabled
00:06:45.508 EAL: Heap on socket 0 was shrunk by 514MB
00:06:45.508 EAL: Trying to obtain current memory policy.
00:06:45.508 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:45.766 EAL: Restoring previous memory policy: 4
00:06:45.766 EAL: Calling mem event callback 'spdk:(nil)'
00:06:45.766 EAL: request: mp_malloc_sync
00:06:45.766 EAL: No shared files mode enabled, IPC is disabled
00:06:45.766 EAL: Heap on socket 0 was expanded by 1026MB
00:06:46.025 EAL: Calling mem event callback 'spdk:(nil)'
00:06:46.025 EAL: request: mp_malloc_sync
00:06:46.025 EAL: No shared files mode enabled, IPC is disabled
00:06:46.025 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:46.025 passed
00:06:46.025 
00:06:46.025 Run Summary: Type Total Ran Passed Failed Inactive
00:06:46.025 suites 1 1 n/a 0 0
00:06:46.025 tests 2 2 2 0 0
00:06:46.025 asserts 6702 6702 6702 0 n/a
00:06:46.025 
00:06:46.025 Elapsed time = 1.180 seconds
00:06:46.025 EAL: No shared files mode enabled, IPC is disabled
00:06:46.025 EAL: No shared files mode enabled, IPC is disabled
00:06:46.025 EAL: No shared files mode enabled, IPC is disabled
00:06:46.025 
00:06:46.025 real 0m1.381s
00:06:46.025 user 0m0.769s
00:06:46.025 sys 0m0.580s
00:06:46.025 11:47:59 env.env_vtophys -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:46.025 11:47:59 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:46.025 ************************************
00:06:46.025 END TEST env_vtophys
00:06:46.025 ************************************
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1142 -- # return 0
00:06:46.284 11:47:59 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:46.284 11:47:59 env -- common/autotest_common.sh@10 -- # set +x
00:06:46.284 ************************************
00:06:46.284 START TEST env_pci
00:06:46.284 ************************************
00:06:46.284 11:47:59 env.env_pci -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:46.284 
00:06:46.284 
00:06:46.284 CUnit - A unit testing framework for C - Version 2.1-3
00:06:46.284 http://cunit.sourceforge.net/
00:06:46.284 
00:06:46.284 
00:06:46.284 Suite: pci
00:06:46.284 Test: pci_hook ...[2024-07-15 11:47:59.720653] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 1407019 has claimed it
00:06:46.284 EAL: Cannot find device (10000:00:01.0)
00:06:46.284 EAL: Failed to attach device on primary process
00:06:46.284 passed
00:06:46.284 
00:06:46.284 Run Summary: Type Total Ran Passed Failed Inactive
00:06:46.284 suites 1 1 n/a 0 0
00:06:46.284 tests 1 1 1 0 0
00:06:46.284 asserts 25 25 25 0 n/a
00:06:46.284 
00:06:46.284 Elapsed time = 0.042 seconds
00:06:46.284 
00:06:46.284 real 0m0.070s
00:06:46.284 user 0m0.026s
00:06:46.284 sys 0m0.044s
00:06:46.284 11:47:59 env.env_pci -- common/autotest_common.sh@1124 -- # xtrace_disable
00:06:46.284 11:47:59 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:46.284 ************************************
00:06:46.284 END TEST env_pci
00:06:46.284 ************************************
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1142 -- # return 0
00:06:46.284 11:47:59 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:46.284 11:47:59 env -- env/env.sh@15 -- # uname
00:06:46.284 11:47:59 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:46.284 11:47:59 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:46.284 11:47:59 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:06:46.284 11:47:59 env -- common/autotest_common.sh@1105 -- # xtrace_disable
00:06:46.284 11:47:59 env -- common/autotest_common.sh@10 -- # set +x
00:06:46.284 ************************************
00:06:46.284 START TEST env_dpdk_post_init
00:06:46.284 ************************************
00:06:46.284 11:47:59 env.env_dpdk_post_init -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:46.544 EAL: Detected CPU lcores: 72
00:06:46.544 EAL: Detected NUMA nodes: 2
00:06:46.544 EAL: Detected shared linkage of DPDK
00:06:46.544 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:46.544 EAL: Selected IOVA mode 'PA'
00:06:46.544 EAL: VFIO support initialized
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.544 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0)
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.544 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym
00:06:46.544 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.6 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.545 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1)
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.545 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym
00:06:46.545 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1)
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:46.546 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym
00:06:46.546 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:46.546 TELEMETRY: No legacy callbacks, legacy socket not created
00:06:46.546 EAL: Using IOMMU type 1 (Type 1)
00:06:46.546 EAL: Ignore mapping IO port bar(1)
00:06:46.546 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0)
00:06:46.546 EAL: Ignore mapping IO port bar(1)
00:06:46.546 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0)
00:06:46.546 EAL: Ignore mapping IO port bar(1)
00:06:46.546 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0)
00:06:46.546 EAL: Ignore mapping IO port bar(1)
00:06:46.546 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0)
00:06:46.546 EAL: Ignore mapping IO port bar(1)
00:06:46.546 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0)
00:06:46.804 EAL: Ignore mapping IO port bar(1)
00:06:46.804 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0)
00:06:46.804 EAL: Ignore mapping IO port bar(1)
00:06:46.804 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0)
00:06:46.804 EAL: Ignore mapping IO port bar(1)
00:06:46.804 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0)
00:06:47.371 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:5e:00.0 (socket 0)
00:06:47.371 EAL: Ignore mapping IO port bar(1)
00:06:47.371 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1)
00:06:47.371 EAL: Ignore mapping IO port bar(1)
00:06:47.371 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1)
00:06:47.371 EAL: Ignore mapping IO port bar(1)
00:06:47.371 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1)
00:06:47.371 EAL: Ignore mapping IO port bar(1)
00:06:47.371 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1)
00:06:47.371 EAL: Ignore mapping IO port bar(1)
00:06:47.371 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1)
00:06:47.630 EAL: Ignore mapping IO port bar(1)
00:06:47.630 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1)
00:06:47.630 EAL: Ignore mapping IO port bar(1) 00:06:47.630 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.6 (socket 1) 00:06:47.630 EAL: Ignore mapping IO port bar(1) 00:06:47.630 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:06:55.846 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:06:55.846 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:06:56.417 Starting DPDK initialization... 00:06:56.417 Starting SPDK post initialization... 00:06:56.417 SPDK NVMe probe 00:06:56.417 Attaching to 0000:5e:00.0 00:06:56.417 Attached to 0000:5e:00.0 00:06:56.417 Cleaning up... 00:06:56.417 00:06:56.417 real 0m9.929s 00:06:56.417 user 0m7.624s 00:06:56.417 sys 0m1.346s 00:06:56.417 11:48:09 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.417 11:48:09 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:56.417 ************************************ 00:06:56.417 END TEST env_dpdk_post_init 00:06:56.417 ************************************ 00:06:56.417 11:48:09 env -- common/autotest_common.sh@1142 -- # return 0 00:06:56.417 11:48:09 env -- env/env.sh@26 -- # uname 00:06:56.417 11:48:09 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:56.417 11:48:09 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:56.417 11:48:09 env -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:56.417 11:48:09 env -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.417 11:48:09 env -- common/autotest_common.sh@10 -- # set +x 00:06:56.417 ************************************ 00:06:56.417 START TEST env_mem_callbacks 00:06:56.417 ************************************ 00:06:56.417 11:48:09 env.env_mem_callbacks -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:56.417 EAL: Detected 
CPU lcores: 72 00:06:56.417 EAL: Detected NUMA nodes: 2 00:06:56.417 EAL: Detected shared linkage of DPDK 00:06:56.417 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:56.417 EAL: Selected IOVA mode 'PA' 00:06:56.417 EAL: VFIO support initialized 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.0 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.0_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.1 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.1_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.2 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.2_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.3 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.3_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.3_qat_sym,socket id: 0, max 
queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.4 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.4_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.5 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.5_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.6 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.6_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:01.7 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:01.7_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.0 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 
0000:3d:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.0_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.1 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.1_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.2 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.2_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.3 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.3_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.4 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.4_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.5 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.5_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.6 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.6_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3d:02.7 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3d:02.7_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3d:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.0 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.0_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.1 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_asym,socket id: 0, max queue pairs: 
0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.1_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.2 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.2_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.3 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.3_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.4 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.4_qat_sym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.417 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.5 (socket 0) 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_asym 00:06:56.417 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.417 CRYPTODEV: Creating cryptodev 0000:3f:01.5_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:3f:01.6 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:01.6_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:01.7 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:01.7_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.0 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.0_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.1 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.1_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.2 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.2_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 
0000:3f:02.2_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.3 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.3_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.4 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.4_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.5 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.5_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.6 (socket 0) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.6_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:3f:02.7 (socket 0) 00:06:56.418 CRYPTODEV: 
Creating cryptodev 0000:3f:02.7_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:3f:02.7_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:3f:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.0 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.0_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.1 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.1_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.2 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.2_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.3 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.3_qat_sym 00:06:56.418 CRYPTODEV: 
Initialisation parameters - name: 0000:da:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.4 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.4_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.5 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.5_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.6 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.6_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:01.7 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:01.7_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.0 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_asym 
00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.0_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.1 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.1_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.2 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.2_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.3 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.3_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.4 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.4_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 
0000:da:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.5 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.5_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.6 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.6_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:da:02.7 (socket 1) 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_asym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:56.418 CRYPTODEV: Creating cryptodev 0000:da:02.7_qat_sym 00:06:56.418 CRYPTODEV: Initialisation parameters - name: 0000:da:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:56.418 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:56.418 00:06:56.418 00:06:56.418 CUnit - A unit testing framework for C - Version 2.1-3 00:06:56.418 http://cunit.sourceforge.net/ 00:06:56.418 00:06:56.418 00:06:56.418 Suite: memory 00:06:56.418 Test: test ... 
00:06:56.418 register 0x200000200000 2097152 00:06:56.418 register 0x201000a00000 2097152 00:06:56.418 malloc 3145728 00:06:56.418 register 0x200000400000 4194304 00:06:56.418 buf 0x200000500000 len 3145728 PASSED 00:06:56.418 malloc 64 00:06:56.418 buf 0x2000004fff40 len 64 PASSED 00:06:56.418 malloc 4194304 00:06:56.418 register 0x200000800000 6291456 00:06:56.418 buf 0x200000a00000 len 4194304 PASSED 00:06:56.418 free 0x200000500000 3145728 00:06:56.418 free 0x2000004fff40 64 00:06:56.418 unregister 0x200000400000 4194304 PASSED 00:06:56.418 free 0x200000a00000 4194304 00:06:56.418 unregister 0x200000800000 6291456 PASSED 00:06:56.418 malloc 8388608 00:06:56.418 register 0x200000400000 10485760 00:06:56.418 buf 0x200000600000 len 8388608 PASSED 00:06:56.418 free 0x200000600000 8388608 00:06:56.418 unregister 0x200000400000 10485760 PASSED 00:06:56.418 passed 00:06:56.418 00:06:56.418 Run Summary: Type Total Ran Passed Failed Inactive 00:06:56.419 suites 1 1 n/a 0 0 00:06:56.419 tests 1 1 1 0 0 00:06:56.419 asserts 16 16 16 0 n/a 00:06:56.419 00:06:56.419 Elapsed time = 0.007 seconds 00:06:56.419 00:06:56.419 real 0m0.110s 00:06:56.419 user 0m0.037s 00:06:56.419 sys 0m0.073s 00:06:56.419 11:48:09 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.419 11:48:09 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:56.419 ************************************ 00:06:56.419 END TEST env_mem_callbacks 00:06:56.419 ************************************ 00:06:56.678 11:48:10 env -- common/autotest_common.sh@1142 -- # return 0 00:06:56.678 00:06:56.678 real 0m12.244s 00:06:56.678 user 0m8.851s 00:06:56.678 sys 0m2.445s 00:06:56.678 11:48:10 env -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:56.678 11:48:10 env -- common/autotest_common.sh@10 -- # set +x 00:06:56.678 ************************************ 00:06:56.678 END TEST env 00:06:56.678 ************************************ 00:06:56.678 11:48:10 -- 
common/autotest_common.sh@1142 -- # return 0 00:06:56.678 11:48:10 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:56.678 11:48:10 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:56.678 11:48:10 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:56.678 11:48:10 -- common/autotest_common.sh@10 -- # set +x 00:06:56.678 ************************************ 00:06:56.678 START TEST rpc 00:06:56.678 ************************************ 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:56.678 * Looking for test storage... 00:06:56.678 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:56.678 11:48:10 rpc -- rpc/rpc.sh@65 -- # spdk_pid=1409051 00:06:56.678 11:48:10 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:56.678 11:48:10 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:56.678 11:48:10 rpc -- rpc/rpc.sh@67 -- # waitforlisten 1409051 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@829 -- # '[' -z 1409051 ']' 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:56.678 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:06:56.678 11:48:10 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:56.937 [2024-07-15 11:48:10.296587] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:06:56.937 [2024-07-15 11:48:10.296663] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1409051 ] 00:06:56.937 [2024-07-15 11:48:10.426049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:56.937 [2024-07-15 11:48:10.530967] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:56.937 [2024-07-15 11:48:10.531007] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 1409051' to capture a snapshot of events at runtime. 00:06:56.937 [2024-07-15 11:48:10.531021] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:56.937 [2024-07-15 11:48:10.531034] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:56.937 [2024-07-15 11:48:10.531044] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid1409051 for offline analysis/debug. 
00:06:56.937 [2024-07-15 11:48:10.531080] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.874 11:48:11 rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:06:57.874 11:48:11 rpc -- common/autotest_common.sh@862 -- # return 0 00:06:57.874 11:48:11 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:57.874 11:48:11 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:57.874 11:48:11 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:57.874 11:48:11 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:57.874 11:48:11 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:57.874 11:48:11 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:57.874 11:48:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:57.874 ************************************ 00:06:57.874 START TEST rpc_integrity 00:06:57.874 ************************************ 00:06:57.874 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:57.874 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:57.874 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.874 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:57.874 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.874 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:06:57.874 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:57.875 { 00:06:57.875 "name": "Malloc0", 00:06:57.875 "aliases": [ 00:06:57.875 "3fc46fbf-5498-4f50-864a-80898650a47c" 00:06:57.875 ], 00:06:57.875 "product_name": "Malloc disk", 00:06:57.875 "block_size": 512, 00:06:57.875 "num_blocks": 16384, 00:06:57.875 "uuid": "3fc46fbf-5498-4f50-864a-80898650a47c", 00:06:57.875 "assigned_rate_limits": { 00:06:57.875 "rw_ios_per_sec": 0, 00:06:57.875 "rw_mbytes_per_sec": 0, 00:06:57.875 "r_mbytes_per_sec": 0, 00:06:57.875 "w_mbytes_per_sec": 0 00:06:57.875 }, 00:06:57.875 "claimed": false, 00:06:57.875 "zoned": false, 00:06:57.875 "supported_io_types": { 00:06:57.875 "read": true, 00:06:57.875 "write": true, 00:06:57.875 "unmap": true, 00:06:57.875 "flush": true, 00:06:57.875 "reset": true, 00:06:57.875 "nvme_admin": false, 00:06:57.875 "nvme_io": false, 00:06:57.875 "nvme_io_md": false, 00:06:57.875 "write_zeroes": true, 00:06:57.875 "zcopy": true, 00:06:57.875 "get_zone_info": false, 00:06:57.875 "zone_management": 
false, 00:06:57.875 "zone_append": false, 00:06:57.875 "compare": false, 00:06:57.875 "compare_and_write": false, 00:06:57.875 "abort": true, 00:06:57.875 "seek_hole": false, 00:06:57.875 "seek_data": false, 00:06:57.875 "copy": true, 00:06:57.875 "nvme_iov_md": false 00:06:57.875 }, 00:06:57.875 "memory_domains": [ 00:06:57.875 { 00:06:57.875 "dma_device_id": "system", 00:06:57.875 "dma_device_type": 1 00:06:57.875 }, 00:06:57.875 { 00:06:57.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:57.875 "dma_device_type": 2 00:06:57.875 } 00:06:57.875 ], 00:06:57.875 "driver_specific": {} 00:06:57.875 } 00:06:57.875 ]' 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:57.875 [2024-07-15 11:48:11.412456] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:57.875 [2024-07-15 11:48:11.412497] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:57.875 [2024-07-15 11:48:11.412516] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19ce670 00:06:57.875 [2024-07-15 11:48:11.412528] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:57.875 [2024-07-15 11:48:11.414125] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:57.875 [2024-07-15 11:48:11.414153] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:57.875 Passthru0 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:57.875 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:57.875 { 00:06:57.875 "name": "Malloc0", 00:06:57.875 "aliases": [ 00:06:57.875 "3fc46fbf-5498-4f50-864a-80898650a47c" 00:06:57.875 ], 00:06:57.875 "product_name": "Malloc disk", 00:06:57.875 "block_size": 512, 00:06:57.875 "num_blocks": 16384, 00:06:57.875 "uuid": "3fc46fbf-5498-4f50-864a-80898650a47c", 00:06:57.875 "assigned_rate_limits": { 00:06:57.875 "rw_ios_per_sec": 0, 00:06:57.875 "rw_mbytes_per_sec": 0, 00:06:57.875 "r_mbytes_per_sec": 0, 00:06:57.875 "w_mbytes_per_sec": 0 00:06:57.875 }, 00:06:57.875 "claimed": true, 00:06:57.875 "claim_type": "exclusive_write", 00:06:57.875 "zoned": false, 00:06:57.875 "supported_io_types": { 00:06:57.875 "read": true, 00:06:57.875 "write": true, 00:06:57.875 "unmap": true, 00:06:57.875 "flush": true, 00:06:57.875 "reset": true, 00:06:57.875 "nvme_admin": false, 00:06:57.875 "nvme_io": false, 00:06:57.875 "nvme_io_md": false, 00:06:57.875 "write_zeroes": true, 00:06:57.875 "zcopy": true, 00:06:57.875 "get_zone_info": false, 00:06:57.875 "zone_management": false, 00:06:57.875 "zone_append": false, 00:06:57.875 "compare": false, 00:06:57.875 "compare_and_write": false, 00:06:57.875 "abort": true, 00:06:57.875 "seek_hole": false, 00:06:57.875 "seek_data": false, 00:06:57.875 "copy": true, 00:06:57.875 "nvme_iov_md": false 00:06:57.875 }, 00:06:57.875 "memory_domains": [ 00:06:57.875 { 00:06:57.875 "dma_device_id": "system", 00:06:57.875 "dma_device_type": 1 00:06:57.875 }, 00:06:57.875 { 00:06:57.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:57.875 "dma_device_type": 2 00:06:57.875 } 00:06:57.875 ], 00:06:57.875 "driver_specific": {} 00:06:57.875 }, 00:06:57.875 { 00:06:57.875 
"name": "Passthru0", 00:06:57.875 "aliases": [ 00:06:57.875 "25dc3b8a-64e4-5d05-8a4e-cd1f7363ab3e" 00:06:57.875 ], 00:06:57.875 "product_name": "passthru", 00:06:57.875 "block_size": 512, 00:06:57.875 "num_blocks": 16384, 00:06:57.875 "uuid": "25dc3b8a-64e4-5d05-8a4e-cd1f7363ab3e", 00:06:57.875 "assigned_rate_limits": { 00:06:57.875 "rw_ios_per_sec": 0, 00:06:57.875 "rw_mbytes_per_sec": 0, 00:06:57.875 "r_mbytes_per_sec": 0, 00:06:57.875 "w_mbytes_per_sec": 0 00:06:57.875 }, 00:06:57.875 "claimed": false, 00:06:57.875 "zoned": false, 00:06:57.875 "supported_io_types": { 00:06:57.875 "read": true, 00:06:57.875 "write": true, 00:06:57.875 "unmap": true, 00:06:57.875 "flush": true, 00:06:57.875 "reset": true, 00:06:57.875 "nvme_admin": false, 00:06:57.875 "nvme_io": false, 00:06:57.875 "nvme_io_md": false, 00:06:57.875 "write_zeroes": true, 00:06:57.875 "zcopy": true, 00:06:57.875 "get_zone_info": false, 00:06:57.875 "zone_management": false, 00:06:57.875 "zone_append": false, 00:06:57.875 "compare": false, 00:06:57.875 "compare_and_write": false, 00:06:57.875 "abort": true, 00:06:57.875 "seek_hole": false, 00:06:57.875 "seek_data": false, 00:06:57.875 "copy": true, 00:06:57.875 "nvme_iov_md": false 00:06:57.875 }, 00:06:57.875 "memory_domains": [ 00:06:57.875 { 00:06:57.875 "dma_device_id": "system", 00:06:57.875 "dma_device_type": 1 00:06:57.875 }, 00:06:57.875 { 00:06:57.875 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:57.875 "dma_device_type": 2 00:06:57.875 } 00:06:57.875 ], 00:06:57.875 "driver_specific": { 00:06:57.875 "passthru": { 00:06:57.875 "name": "Passthru0", 00:06:57.875 "base_bdev_name": "Malloc0" 00:06:57.875 } 00:06:57.875 } 00:06:57.875 } 00:06:57.875 ]' 00:06:57.875 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:58.156 11:48:11 rpc.rpc_integrity -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:58.156 11:48:11 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:58.156 00:06:58.156 real 0m0.315s 00:06:58.156 user 0m0.191s 00:06:58.156 sys 0m0.055s 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 ************************************ 00:06:58.156 END TEST rpc_integrity 00:06:58.156 ************************************ 00:06:58.156 11:48:11 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:58.156 11:48:11 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:58.156 11:48:11 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.156 11:48:11 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.156 11:48:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 
************************************ 00:06:58.156 START TEST rpc_plugins 00:06:58.156 ************************************ 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@1123 -- # rpc_plugins 00:06:58.156 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.156 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:58.156 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:58.156 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.156 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:58.156 { 00:06:58.156 "name": "Malloc1", 00:06:58.156 "aliases": [ 00:06:58.156 "aa9932c3-dbc2-464c-912e-7c9b859299be" 00:06:58.156 ], 00:06:58.156 "product_name": "Malloc disk", 00:06:58.156 "block_size": 4096, 00:06:58.156 "num_blocks": 256, 00:06:58.156 "uuid": "aa9932c3-dbc2-464c-912e-7c9b859299be", 00:06:58.156 "assigned_rate_limits": { 00:06:58.156 "rw_ios_per_sec": 0, 00:06:58.156 "rw_mbytes_per_sec": 0, 00:06:58.156 "r_mbytes_per_sec": 0, 00:06:58.156 "w_mbytes_per_sec": 0 00:06:58.156 }, 00:06:58.156 "claimed": false, 00:06:58.156 "zoned": false, 00:06:58.156 "supported_io_types": { 00:06:58.156 "read": true, 00:06:58.156 "write": true, 00:06:58.156 "unmap": true, 00:06:58.156 "flush": true, 00:06:58.156 "reset": true, 00:06:58.156 "nvme_admin": false, 00:06:58.156 "nvme_io": false, 00:06:58.157 "nvme_io_md": false, 00:06:58.157 "write_zeroes": true, 00:06:58.157 "zcopy": true, 00:06:58.157 
"get_zone_info": false, 00:06:58.157 "zone_management": false, 00:06:58.157 "zone_append": false, 00:06:58.157 "compare": false, 00:06:58.157 "compare_and_write": false, 00:06:58.157 "abort": true, 00:06:58.157 "seek_hole": false, 00:06:58.157 "seek_data": false, 00:06:58.157 "copy": true, 00:06:58.157 "nvme_iov_md": false 00:06:58.157 }, 00:06:58.157 "memory_domains": [ 00:06:58.157 { 00:06:58.157 "dma_device_id": "system", 00:06:58.157 "dma_device_type": 1 00:06:58.157 }, 00:06:58.157 { 00:06:58.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:58.157 "dma_device_type": 2 00:06:58.157 } 00:06:58.157 ], 00:06:58.157 "driver_specific": {} 00:06:58.157 } 00:06:58.157 ]' 00:06:58.157 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:58.157 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:58.157 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:58.157 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.157 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.416 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:58.416 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.416 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.416 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:58.416 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:58.416 11:48:11 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:58.416 00:06:58.416 real 0m0.153s 00:06:58.416 user 0m0.093s 00:06:58.416 sys 0m0.028s 00:06:58.416 11:48:11 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.416 11:48:11 rpc.rpc_plugins -- 
common/autotest_common.sh@10 -- # set +x 00:06:58.416 ************************************ 00:06:58.416 END TEST rpc_plugins 00:06:58.416 ************************************ 00:06:58.416 11:48:11 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:58.416 11:48:11 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:58.416 11:48:11 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.416 11:48:11 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.416 11:48:11 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 ************************************ 00:06:58.416 START TEST rpc_trace_cmd_test 00:06:58.416 ************************************ 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1123 -- # rpc_trace_cmd_test 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:58.416 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid1409051", 00:06:58.416 "tpoint_group_mask": "0x8", 00:06:58.416 "iscsi_conn": { 00:06:58.416 "mask": "0x2", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "scsi": { 00:06:58.416 "mask": "0x4", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "bdev": { 00:06:58.416 "mask": "0x8", 00:06:58.416 "tpoint_mask": "0xffffffffffffffff" 00:06:58.416 }, 00:06:58.416 "nvmf_rdma": { 00:06:58.416 "mask": "0x10", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "nvmf_tcp": { 00:06:58.416 "mask": "0x20", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 
00:06:58.416 "ftl": { 00:06:58.416 "mask": "0x40", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "blobfs": { 00:06:58.416 "mask": "0x80", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "dsa": { 00:06:58.416 "mask": "0x200", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "thread": { 00:06:58.416 "mask": "0x400", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "nvme_pcie": { 00:06:58.416 "mask": "0x800", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "iaa": { 00:06:58.416 "mask": "0x1000", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "nvme_tcp": { 00:06:58.416 "mask": "0x2000", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "bdev_nvme": { 00:06:58.416 "mask": "0x4000", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 }, 00:06:58.416 "sock": { 00:06:58.416 "mask": "0x8000", 00:06:58.416 "tpoint_mask": "0x0" 00:06:58.416 } 00:06:58.416 }' 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:58.416 11:48:11 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:58.417 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:58.676 11:48:12 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:58.677 00:06:58.677 real 0m0.244s 00:06:58.677 user 0m0.203s 00:06:58.677 sys 0m0.034s 00:06:58.677 11:48:12 rpc.rpc_trace_cmd_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.677 11:48:12 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:58.677 ************************************ 00:06:58.677 END TEST rpc_trace_cmd_test 00:06:58.677 ************************************ 00:06:58.677 11:48:12 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:58.677 11:48:12 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:58.677 11:48:12 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:58.677 11:48:12 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:58.677 11:48:12 rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:58.677 11:48:12 rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:58.677 11:48:12 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.677 ************************************ 00:06:58.677 START TEST rpc_daemon_integrity 00:06:58.677 ************************************ 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1123 -- # rpc_integrity 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:58.677 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.937 11:48:12 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:58.937 { 00:06:58.937 "name": "Malloc2", 00:06:58.937 "aliases": [ 00:06:58.937 "97abef3c-982a-4bd8-a90d-96cdc9bbf548" 00:06:58.937 ], 00:06:58.937 "product_name": "Malloc disk", 00:06:58.937 "block_size": 512, 00:06:58.937 "num_blocks": 16384, 00:06:58.937 "uuid": "97abef3c-982a-4bd8-a90d-96cdc9bbf548", 00:06:58.937 "assigned_rate_limits": { 00:06:58.937 "rw_ios_per_sec": 0, 00:06:58.937 "rw_mbytes_per_sec": 0, 00:06:58.937 "r_mbytes_per_sec": 0, 00:06:58.937 "w_mbytes_per_sec": 0 00:06:58.937 }, 00:06:58.937 "claimed": false, 00:06:58.937 "zoned": false, 00:06:58.937 "supported_io_types": { 00:06:58.937 "read": true, 00:06:58.937 "write": true, 00:06:58.937 "unmap": true, 00:06:58.937 "flush": true, 00:06:58.937 "reset": true, 00:06:58.937 "nvme_admin": false, 00:06:58.937 "nvme_io": false, 00:06:58.937 "nvme_io_md": false, 00:06:58.937 "write_zeroes": true, 00:06:58.937 "zcopy": true, 00:06:58.937 "get_zone_info": false, 00:06:58.937 "zone_management": false, 00:06:58.937 "zone_append": false, 00:06:58.937 "compare": false, 00:06:58.937 "compare_and_write": false, 00:06:58.937 "abort": true, 00:06:58.937 "seek_hole": false, 00:06:58.937 "seek_data": false, 00:06:58.937 "copy": true, 00:06:58.937 "nvme_iov_md": false 00:06:58.937 }, 00:06:58.937 "memory_domains": [ 00:06:58.937 { 00:06:58.937 "dma_device_id": "system", 00:06:58.937 "dma_device_type": 
1 00:06:58.937 }, 00:06:58.937 { 00:06:58.937 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:58.937 "dma_device_type": 2 00:06:58.937 } 00:06:58.937 ], 00:06:58.937 "driver_specific": {} 00:06:58.937 } 00:06:58.937 ]' 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.937 [2024-07-15 11:48:12.363175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:58.937 [2024-07-15 11:48:12.363212] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:58.937 [2024-07-15 11:48:12.363230] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b782c0 00:06:58.937 [2024-07-15 11:48:12.363242] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:58.937 [2024-07-15 11:48:12.364619] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:58.937 [2024-07-15 11:48:12.364645] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:58.937 Passthru0 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:58.937 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 
00:06:58.938 { 00:06:58.938 "name": "Malloc2", 00:06:58.938 "aliases": [ 00:06:58.938 "97abef3c-982a-4bd8-a90d-96cdc9bbf548" 00:06:58.938 ], 00:06:58.938 "product_name": "Malloc disk", 00:06:58.938 "block_size": 512, 00:06:58.938 "num_blocks": 16384, 00:06:58.938 "uuid": "97abef3c-982a-4bd8-a90d-96cdc9bbf548", 00:06:58.938 "assigned_rate_limits": { 00:06:58.938 "rw_ios_per_sec": 0, 00:06:58.938 "rw_mbytes_per_sec": 0, 00:06:58.938 "r_mbytes_per_sec": 0, 00:06:58.938 "w_mbytes_per_sec": 0 00:06:58.938 }, 00:06:58.938 "claimed": true, 00:06:58.938 "claim_type": "exclusive_write", 00:06:58.938 "zoned": false, 00:06:58.938 "supported_io_types": { 00:06:58.938 "read": true, 00:06:58.938 "write": true, 00:06:58.938 "unmap": true, 00:06:58.938 "flush": true, 00:06:58.938 "reset": true, 00:06:58.938 "nvme_admin": false, 00:06:58.938 "nvme_io": false, 00:06:58.938 "nvme_io_md": false, 00:06:58.938 "write_zeroes": true, 00:06:58.938 "zcopy": true, 00:06:58.938 "get_zone_info": false, 00:06:58.938 "zone_management": false, 00:06:58.938 "zone_append": false, 00:06:58.938 "compare": false, 00:06:58.938 "compare_and_write": false, 00:06:58.938 "abort": true, 00:06:58.938 "seek_hole": false, 00:06:58.938 "seek_data": false, 00:06:58.938 "copy": true, 00:06:58.938 "nvme_iov_md": false 00:06:58.938 }, 00:06:58.938 "memory_domains": [ 00:06:58.938 { 00:06:58.938 "dma_device_id": "system", 00:06:58.938 "dma_device_type": 1 00:06:58.938 }, 00:06:58.938 { 00:06:58.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:58.938 "dma_device_type": 2 00:06:58.938 } 00:06:58.938 ], 00:06:58.938 "driver_specific": {} 00:06:58.938 }, 00:06:58.938 { 00:06:58.938 "name": "Passthru0", 00:06:58.938 "aliases": [ 00:06:58.938 "80e9f2c4-eeb7-526c-8fe1-7d735184a8e8" 00:06:58.938 ], 00:06:58.938 "product_name": "passthru", 00:06:58.938 "block_size": 512, 00:06:58.938 "num_blocks": 16384, 00:06:58.938 "uuid": "80e9f2c4-eeb7-526c-8fe1-7d735184a8e8", 00:06:58.938 "assigned_rate_limits": { 00:06:58.938 
"rw_ios_per_sec": 0, 00:06:58.938 "rw_mbytes_per_sec": 0, 00:06:58.938 "r_mbytes_per_sec": 0, 00:06:58.938 "w_mbytes_per_sec": 0 00:06:58.938 }, 00:06:58.938 "claimed": false, 00:06:58.938 "zoned": false, 00:06:58.938 "supported_io_types": { 00:06:58.938 "read": true, 00:06:58.938 "write": true, 00:06:58.938 "unmap": true, 00:06:58.938 "flush": true, 00:06:58.938 "reset": true, 00:06:58.938 "nvme_admin": false, 00:06:58.938 "nvme_io": false, 00:06:58.938 "nvme_io_md": false, 00:06:58.938 "write_zeroes": true, 00:06:58.938 "zcopy": true, 00:06:58.938 "get_zone_info": false, 00:06:58.938 "zone_management": false, 00:06:58.938 "zone_append": false, 00:06:58.938 "compare": false, 00:06:58.938 "compare_and_write": false, 00:06:58.938 "abort": true, 00:06:58.938 "seek_hole": false, 00:06:58.938 "seek_data": false, 00:06:58.938 "copy": true, 00:06:58.938 "nvme_iov_md": false 00:06:58.938 }, 00:06:58.938 "memory_domains": [ 00:06:58.938 { 00:06:58.938 "dma_device_id": "system", 00:06:58.938 "dma_device_type": 1 00:06:58.938 }, 00:06:58.938 { 00:06:58.938 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:58.938 "dma_device_type": 2 00:06:58.938 } 00:06:58.938 ], 00:06:58.938 "driver_specific": { 00:06:58.938 "passthru": { 00:06:58.938 "name": "Passthru0", 00:06:58.938 "base_bdev_name": "Malloc2" 00:06:58.938 } 00:06:58.938 } 00:06:58.938 } 00:06:58.938 ]' 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # 
rpc_cmd bdev_malloc_delete Malloc2 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@559 -- # xtrace_disable 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:58.938 00:06:58.938 real 0m0.303s 00:06:58.938 user 0m0.193s 00:06:58.938 sys 0m0.050s 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:58.938 11:48:12 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:58.938 ************************************ 00:06:58.938 END TEST rpc_daemon_integrity 00:06:58.938 ************************************ 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@1142 -- # return 0 00:06:59.198 11:48:12 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:59.198 11:48:12 rpc -- rpc/rpc.sh@84 -- # killprocess 1409051 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@948 -- # '[' -z 1409051 ']' 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@952 -- # kill -0 1409051 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@953 -- # uname 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o 
comm= 1409051 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1409051' 00:06:59.198 killing process with pid 1409051 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@967 -- # kill 1409051 00:06:59.198 11:48:12 rpc -- common/autotest_common.sh@972 -- # wait 1409051 00:06:59.458 00:06:59.458 real 0m2.883s 00:06:59.458 user 0m3.703s 00:06:59.458 sys 0m0.894s 00:06:59.458 11:48:13 rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:06:59.458 11:48:13 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:59.458 ************************************ 00:06:59.458 END TEST rpc 00:06:59.458 ************************************ 00:06:59.458 11:48:13 -- common/autotest_common.sh@1142 -- # return 0 00:06:59.458 11:48:13 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:59.458 11:48:13 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:59.458 11:48:13 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.458 11:48:13 -- common/autotest_common.sh@10 -- # set +x 00:06:59.718 ************************************ 00:06:59.718 START TEST skip_rpc 00:06:59.718 ************************************ 00:06:59.718 11:48:13 skip_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:59.718 * Looking for test storage... 
00:06:59.718 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:59.718 11:48:13 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:59.718 11:48:13 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:59.718 11:48:13 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:59.718 11:48:13 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:06:59.718 11:48:13 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:06:59.718 11:48:13 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:59.718 ************************************ 00:06:59.718 START TEST skip_rpc 00:06:59.718 ************************************ 00:06:59.718 11:48:13 skip_rpc.skip_rpc -- common/autotest_common.sh@1123 -- # test_skip_rpc 00:06:59.718 11:48:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=1409591 00:06:59.718 11:48:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:59.718 11:48:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:59.718 11:48:13 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:59.718 [2024-07-15 11:48:13.307349] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:06:59.718 [2024-07-15 11:48:13.307418] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1409591 ] 00:06:59.977 [2024-07-15 11:48:13.436537] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.977 [2024-07-15 11:48:13.538448] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@648 -- # local es=0 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # valid_exec_arg rpc_cmd spdk_get_version 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@636 -- # local arg=rpc_cmd 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # type -t rpc_cmd 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # rpc_cmd spdk_get_version 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # es=1 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 1409591 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@948 -- # '[' -z 1409591 ']' 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@952 -- # kill -0 1409591 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # uname 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1409591 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1409591' 00:07:05.254 killing process with pid 1409591 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # kill 1409591 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@972 -- # wait 1409591 00:07:05.254 00:07:05.254 real 0m5.455s 00:07:05.254 user 0m5.087s 00:07:05.254 sys 0m0.383s 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:05.254 11:48:18 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.254 ************************************ 00:07:05.254 END TEST skip_rpc 00:07:05.254 ************************************ 00:07:05.254 11:48:18 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:05.254 11:48:18 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:07:05.254 11:48:18 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:05.254 11:48:18 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:05.254 11:48:18 skip_rpc -- common/autotest_common.sh@10 
-- # set +x 00:07:05.254 ************************************ 00:07:05.254 START TEST skip_rpc_with_json 00:07:05.254 ************************************ 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_json 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=1410362 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 1410362 00:07:05.254 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@829 -- # '[' -z 1410362 ']' 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.255 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:05.255 11:48:18 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:05.514 [2024-07-15 11:48:18.850435] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:07:05.514 [2024-07-15 11:48:18.850504] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1410362 ] 00:07:05.514 [2024-07-15 11:48:18.980251] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.514 [2024-07-15 11:48:19.086857] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.452 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:06.452 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@862 -- # return 0 00:07:06.452 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:07:06.452 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:06.452 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:06.712 [2024-07-15 11:48:20.052121] nvmf_rpc.c:2562:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:07:06.712 request: 00:07:06.712 { 00:07:06.712 "trtype": "tcp", 00:07:06.712 "method": "nvmf_get_transports", 00:07:06.712 "req_id": 1 00:07:06.712 } 00:07:06.712 Got JSON-RPC error response 00:07:06.712 response: 00:07:06.712 { 00:07:06.712 "code": -19, 00:07:06.712 "message": "No such device" 00:07:06.712 } 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 1 == 0 ]] 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:06.712 [2024-07-15 11:48:20.064283] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:07:06.712 11:48:20 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@559 -- # xtrace_disable 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:07:06.712 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:06.712 { 00:07:06.712 "subsystems": [ 00:07:06.712 { 00:07:06.712 "subsystem": "keyring", 00:07:06.712 "config": [] 00:07:06.712 }, 00:07:06.712 { 00:07:06.712 "subsystem": "iobuf", 00:07:06.712 "config": [ 00:07:06.712 { 00:07:06.712 "method": "iobuf_set_options", 00:07:06.712 "params": { 00:07:06.712 "small_pool_count": 8192, 00:07:06.712 "large_pool_count": 1024, 00:07:06.712 "small_bufsize": 8192, 00:07:06.712 "large_bufsize": 135168 00:07:06.712 } 00:07:06.712 } 00:07:06.712 ] 00:07:06.712 }, 00:07:06.712 { 00:07:06.712 "subsystem": "sock", 00:07:06.712 "config": [ 00:07:06.712 { 00:07:06.712 "method": "sock_set_default_impl", 00:07:06.712 "params": { 00:07:06.712 "impl_name": "posix" 00:07:06.712 } 00:07:06.712 }, 00:07:06.712 { 00:07:06.712 "method": "sock_impl_set_options", 00:07:06.712 "params": { 00:07:06.712 "impl_name": "ssl", 00:07:06.712 "recv_buf_size": 4096, 00:07:06.712 "send_buf_size": 4096, 00:07:06.712 "enable_recv_pipe": true, 00:07:06.712 "enable_quickack": false, 00:07:06.712 "enable_placement_id": 0, 00:07:06.713 "enable_zerocopy_send_server": true, 00:07:06.713 "enable_zerocopy_send_client": false, 00:07:06.713 "zerocopy_threshold": 0, 00:07:06.713 "tls_version": 0, 00:07:06.713 "enable_ktls": false 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "sock_impl_set_options", 00:07:06.713 "params": { 
00:07:06.713 "impl_name": "posix", 00:07:06.713 "recv_buf_size": 2097152, 00:07:06.713 "send_buf_size": 2097152, 00:07:06.713 "enable_recv_pipe": true, 00:07:06.713 "enable_quickack": false, 00:07:06.713 "enable_placement_id": 0, 00:07:06.713 "enable_zerocopy_send_server": true, 00:07:06.713 "enable_zerocopy_send_client": false, 00:07:06.713 "zerocopy_threshold": 0, 00:07:06.713 "tls_version": 0, 00:07:06.713 "enable_ktls": false 00:07:06.713 } 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "vmd", 00:07:06.713 "config": [] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "accel", 00:07:06.713 "config": [ 00:07:06.713 { 00:07:06.713 "method": "accel_set_options", 00:07:06.713 "params": { 00:07:06.713 "small_cache_size": 128, 00:07:06.713 "large_cache_size": 16, 00:07:06.713 "task_count": 2048, 00:07:06.713 "sequence_count": 2048, 00:07:06.713 "buf_count": 2048 00:07:06.713 } 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "bdev", 00:07:06.713 "config": [ 00:07:06.713 { 00:07:06.713 "method": "bdev_set_options", 00:07:06.713 "params": { 00:07:06.713 "bdev_io_pool_size": 65535, 00:07:06.713 "bdev_io_cache_size": 256, 00:07:06.713 "bdev_auto_examine": true, 00:07:06.713 "iobuf_small_cache_size": 128, 00:07:06.713 "iobuf_large_cache_size": 16 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "bdev_raid_set_options", 00:07:06.713 "params": { 00:07:06.713 "process_window_size_kb": 1024 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "bdev_iscsi_set_options", 00:07:06.713 "params": { 00:07:06.713 "timeout_sec": 30 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "bdev_nvme_set_options", 00:07:06.713 "params": { 00:07:06.713 "action_on_timeout": "none", 00:07:06.713 "timeout_us": 0, 00:07:06.713 "timeout_admin_us": 0, 00:07:06.713 "keep_alive_timeout_ms": 10000, 00:07:06.713 "arbitration_burst": 0, 00:07:06.713 
"low_priority_weight": 0, 00:07:06.713 "medium_priority_weight": 0, 00:07:06.713 "high_priority_weight": 0, 00:07:06.713 "nvme_adminq_poll_period_us": 10000, 00:07:06.713 "nvme_ioq_poll_period_us": 0, 00:07:06.713 "io_queue_requests": 0, 00:07:06.713 "delay_cmd_submit": true, 00:07:06.713 "transport_retry_count": 4, 00:07:06.713 "bdev_retry_count": 3, 00:07:06.713 "transport_ack_timeout": 0, 00:07:06.713 "ctrlr_loss_timeout_sec": 0, 00:07:06.713 "reconnect_delay_sec": 0, 00:07:06.713 "fast_io_fail_timeout_sec": 0, 00:07:06.713 "disable_auto_failback": false, 00:07:06.713 "generate_uuids": false, 00:07:06.713 "transport_tos": 0, 00:07:06.713 "nvme_error_stat": false, 00:07:06.713 "rdma_srq_size": 0, 00:07:06.713 "io_path_stat": false, 00:07:06.713 "allow_accel_sequence": false, 00:07:06.713 "rdma_max_cq_size": 0, 00:07:06.713 "rdma_cm_event_timeout_ms": 0, 00:07:06.713 "dhchap_digests": [ 00:07:06.713 "sha256", 00:07:06.713 "sha384", 00:07:06.713 "sha512" 00:07:06.713 ], 00:07:06.713 "dhchap_dhgroups": [ 00:07:06.713 "null", 00:07:06.713 "ffdhe2048", 00:07:06.713 "ffdhe3072", 00:07:06.713 "ffdhe4096", 00:07:06.713 "ffdhe6144", 00:07:06.713 "ffdhe8192" 00:07:06.713 ] 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "bdev_nvme_set_hotplug", 00:07:06.713 "params": { 00:07:06.713 "period_us": 100000, 00:07:06.713 "enable": false 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "bdev_wait_for_examine" 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "scsi", 00:07:06.713 "config": null 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "scheduler", 00:07:06.713 "config": [ 00:07:06.713 { 00:07:06.713 "method": "framework_set_scheduler", 00:07:06.713 "params": { 00:07:06.713 "name": "static" 00:07:06.713 } 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "vhost_scsi", 00:07:06.713 "config": [] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": 
"vhost_blk", 00:07:06.713 "config": [] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "ublk", 00:07:06.713 "config": [] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "nbd", 00:07:06.713 "config": [] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "nvmf", 00:07:06.713 "config": [ 00:07:06.713 { 00:07:06.713 "method": "nvmf_set_config", 00:07:06.713 "params": { 00:07:06.713 "discovery_filter": "match_any", 00:07:06.713 "admin_cmd_passthru": { 00:07:06.713 "identify_ctrlr": false 00:07:06.713 } 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "nvmf_set_max_subsystems", 00:07:06.713 "params": { 00:07:06.713 "max_subsystems": 1024 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "nvmf_set_crdt", 00:07:06.713 "params": { 00:07:06.713 "crdt1": 0, 00:07:06.713 "crdt2": 0, 00:07:06.713 "crdt3": 0 00:07:06.713 } 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "method": "nvmf_create_transport", 00:07:06.713 "params": { 00:07:06.713 "trtype": "TCP", 00:07:06.713 "max_queue_depth": 128, 00:07:06.713 "max_io_qpairs_per_ctrlr": 127, 00:07:06.713 "in_capsule_data_size": 4096, 00:07:06.713 "max_io_size": 131072, 00:07:06.713 "io_unit_size": 131072, 00:07:06.713 "max_aq_depth": 128, 00:07:06.713 "num_shared_buffers": 511, 00:07:06.713 "buf_cache_size": 4294967295, 00:07:06.713 "dif_insert_or_strip": false, 00:07:06.713 "zcopy": false, 00:07:06.713 "c2h_success": true, 00:07:06.713 "sock_priority": 0, 00:07:06.713 "abort_timeout_sec": 1, 00:07:06.713 "ack_timeout": 0, 00:07:06.713 "data_wr_pool_size": 0 00:07:06.713 } 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 }, 00:07:06.713 { 00:07:06.713 "subsystem": "iscsi", 00:07:06.713 "config": [ 00:07:06.713 { 00:07:06.713 "method": "iscsi_set_options", 00:07:06.713 "params": { 00:07:06.713 "node_base": "iqn.2016-06.io.spdk", 00:07:06.713 "max_sessions": 128, 00:07:06.713 "max_connections_per_session": 2, 00:07:06.713 "max_queue_depth": 64, 00:07:06.713 "default_time2wait": 2, 
00:07:06.713 "default_time2retain": 20, 00:07:06.713 "first_burst_length": 8192, 00:07:06.713 "immediate_data": true, 00:07:06.713 "allow_duplicated_isid": false, 00:07:06.713 "error_recovery_level": 0, 00:07:06.713 "nop_timeout": 60, 00:07:06.713 "nop_in_interval": 30, 00:07:06.713 "disable_chap": false, 00:07:06.713 "require_chap": false, 00:07:06.713 "mutual_chap": false, 00:07:06.713 "chap_group": 0, 00:07:06.713 "max_large_datain_per_connection": 64, 00:07:06.713 "max_r2t_per_connection": 4, 00:07:06.713 "pdu_pool_size": 36864, 00:07:06.713 "immediate_data_pool_size": 16384, 00:07:06.713 "data_out_pool_size": 2048 00:07:06.713 } 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 } 00:07:06.713 ] 00:07:06.713 } 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 1410362 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1410362 ']' 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1410362 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1410362 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1410362' 00:07:06.713 killing process with pid 1410362 00:07:06.713 11:48:20 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1410362 00:07:06.713 11:48:20 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1410362 00:07:07.283 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=1410666 00:07:07.283 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:07:07.283 11:48:20 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 1410666 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@948 -- # '[' -z 1410666 ']' 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@952 -- # kill -0 1410666 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # uname 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1410666 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1410666' 00:07:12.592 killing process with pid 1410666 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # kill 1410666 00:07:12.592 11:48:25 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@972 -- # wait 1410666 00:07:12.592 11:48:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:12.592 11:48:26 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:12.592 00:07:12.592 real 0m7.337s 00:07:12.592 user 0m7.183s 00:07:12.592 sys 0m0.976s 00:07:12.592 11:48:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.592 11:48:26 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:12.592 ************************************ 00:07:12.592 END TEST skip_rpc_with_json 00:07:12.592 ************************************ 00:07:12.592 11:48:26 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:12.592 11:48:26 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:12.592 11:48:26 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.592 11:48:26 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.592 11:48:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:12.851 ************************************ 00:07:12.851 START TEST skip_rpc_with_delay 00:07:12.851 ************************************ 00:07:12.851 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1123 -- # test_skip_rpc_with_delay 00:07:12.851 11:48:26 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:12.851 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@648 -- # local es=0 00:07:12.851 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.852 11:48:26 
skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:12.852 [2024-07-15 11:48:26.274753] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:07:12.852 [2024-07-15 11:48:26.274851] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # es=1 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:12.852 00:07:12.852 real 0m0.096s 00:07:12.852 user 0m0.066s 00:07:12.852 sys 0m0.029s 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:12.852 11:48:26 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:12.852 ************************************ 00:07:12.852 END TEST skip_rpc_with_delay 00:07:12.852 ************************************ 00:07:12.852 11:48:26 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:12.852 11:48:26 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:12.852 11:48:26 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:12.852 11:48:26 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:12.852 11:48:26 skip_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:12.852 11:48:26 skip_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:12.852 11:48:26 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:12.852 ************************************ 00:07:12.852 START TEST exit_on_failed_rpc_init 00:07:12.852 ************************************ 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1123 -- # test_exit_on_failed_rpc_init 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=1411421 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- 
rpc/skip_rpc.sh@63 -- # waitforlisten 1411421 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@829 -- # '[' -z 1411421 ']' 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:12.852 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:12.852 11:48:26 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:13.110 [2024-07-15 11:48:26.454813] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:07:13.110 [2024-07-15 11:48:26.454894] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1411421 ] 00:07:13.110 [2024-07-15 11:48:26.584883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:13.110 [2024-07-15 11:48:26.685767] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@862 -- # return 0 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@648 -- # local es=0 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:14.048 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:14.048 [2024-07-15 11:48:27.455494] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:07:14.048 [2024-07-15 11:48:27.455562] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1411601 ] 00:07:14.048 [2024-07-15 11:48:27.589410] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.307 [2024-07-15 11:48:27.701070] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:14.307 [2024-07-15 11:48:27.701168] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:14.307 [2024-07-15 11:48:27.701190] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:14.307 [2024-07-15 11:48:27.701205] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # es=234 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # es=106 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # case "$es" in 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@668 -- # es=1 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 1411421 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@948 -- # '[' -z 1411421 ']' 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@952 -- # kill -0 1411421 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # uname 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1411421 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1411421' 
00:07:14.307 killing process with pid 1411421 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # kill 1411421 00:07:14.307 11:48:27 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@972 -- # wait 1411421 00:07:14.876 00:07:14.876 real 0m1.874s 00:07:14.876 user 0m2.185s 00:07:14.876 sys 0m0.627s 00:07:14.876 11:48:28 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.876 11:48:28 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:14.876 ************************************ 00:07:14.876 END TEST exit_on_failed_rpc_init 00:07:14.876 ************************************ 00:07:14.876 11:48:28 skip_rpc -- common/autotest_common.sh@1142 -- # return 0 00:07:14.876 11:48:28 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:14.876 00:07:14.876 real 0m15.215s 00:07:14.876 user 0m14.692s 00:07:14.876 sys 0m2.333s 00:07:14.876 11:48:28 skip_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:14.876 11:48:28 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:14.876 ************************************ 00:07:14.876 END TEST skip_rpc 00:07:14.876 ************************************ 00:07:14.876 11:48:28 -- common/autotest_common.sh@1142 -- # return 0 00:07:14.876 11:48:28 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:14.876 11:48:28 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:14.876 11:48:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:14.876 11:48:28 -- common/autotest_common.sh@10 -- # set +x 00:07:14.876 ************************************ 00:07:14.876 START TEST rpc_client 00:07:14.876 ************************************ 00:07:14.876 11:48:28 rpc_client -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:07:15.136 * Looking for test storage... 00:07:15.136 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:07:15.136 11:48:28 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:07:15.136 OK 00:07:15.136 11:48:28 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:07:15.136 00:07:15.136 real 0m0.145s 00:07:15.136 user 0m0.056s 00:07:15.136 sys 0m0.100s 00:07:15.136 11:48:28 rpc_client -- common/autotest_common.sh@1124 -- # xtrace_disable 00:07:15.136 11:48:28 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:07:15.136 ************************************ 00:07:15.136 END TEST rpc_client 00:07:15.136 ************************************ 00:07:15.136 11:48:28 -- common/autotest_common.sh@1142 -- # return 0 00:07:15.136 11:48:28 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:07:15.136 11:48:28 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:07:15.136 11:48:28 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:07:15.136 11:48:28 -- common/autotest_common.sh@10 -- # set +x 00:07:15.136 ************************************ 00:07:15.136 START TEST json_config 00:07:15.136 ************************************ 00:07:15.136 11:48:28 json_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@7 -- # uname -s 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:15.136 11:48:28 
json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:15.136 11:48:28 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:07:15.136 11:48:28 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:15.136 11:48:28 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:15.136 11:48:28 json_config -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:15.136 11:48:28 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:15.136 11:48:28 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:15.136 11:48:28 json_config -- paths/export.sh@5 -- # export PATH 00:07:15.136 11:48:28 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@47 -- # : 0 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:15.136 
11:48:28 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:15.136 11:48:28 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@34 -- # 
configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@355 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:07:15.136 INFO: JSON configuration test init 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:07:15.136 11:48:28 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:15.136 11:48:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.136 11:48:28 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:07:15.136 11:48:28 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:15.136 11:48:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.395 11:48:28 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:07:15.395 11:48:28 json_config -- json_config/common.sh@9 -- # local app=target 00:07:15.395 11:48:28 json_config -- json_config/common.sh@10 -- # shift 00:07:15.395 11:48:28 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:15.395 11:48:28 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:15.395 11:48:28 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:15.395 11:48:28 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:15.395 11:48:28 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 
00:07:15.395 11:48:28 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1411883 00:07:15.395 11:48:28 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:15.395 Waiting for target to run... 00:07:15.395 11:48:28 json_config -- json_config/common.sh@25 -- # waitforlisten 1411883 /var/tmp/spdk_tgt.sock 00:07:15.395 11:48:28 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@829 -- # '[' -z 1411883 ']' 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:15.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:07:15.395 11:48:28 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:15.395 [2024-07-15 11:48:28.807237] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:07:15.395 [2024-07-15 11:48:28.807310] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1411883 ] 00:07:15.961 [2024-07-15 11:48:29.409247] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:15.961 [2024-07-15 11:48:29.516788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:16.218 11:48:29 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:07:16.218 11:48:29 json_config -- common/autotest_common.sh@862 -- # return 0 00:07:16.218 11:48:29 json_config -- json_config/common.sh@26 -- # echo '' 00:07:16.218 00:07:16.218 11:48:29 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:07:16.218 11:48:29 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:07:16.218 11:48:29 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:16.218 11:48:29 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:16.218 11:48:29 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:07:16.218 11:48:29 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:07:16.218 11:48:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:07:16.476 11:48:29 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:07:16.476 11:48:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:07:16.476 [2024-07-15 11:48:30.066617] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module 
dpdk_cryptodev 00:07:16.733 11:48:30 json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:07:16.733 11:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:07:16.733 [2024-07-15 11:48:30.231045] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:16.733 11:48:30 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:07:16.733 11:48:30 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:16.733 11:48:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:16.733 11:48:30 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:07:16.733 11:48:30 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:07:16.733 11:48:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:07:16.992 [2024-07-15 11:48:30.480368] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:07:22.266 11:48:35 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:22.266 11:48:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@46 -- # local enabled_types 
00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:07:22.266 11:48:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@48 -- # local get_types 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:07:22.266 11:48:35 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:07:22.266 11:48:35 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:07:22.266 11:48:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@55 -- # return 0 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:07:22.525 11:48:35 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:07:22.525 11:48:35 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:07:22.525 11:48:35 
json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:07:22.525 11:48:35 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:07:22.525 11:48:35 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:07:22.784 11:48:36 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:07:22.784 11:48:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:07:23.044 Nvme0n1p0 Nvme0n1p1 00:07:23.044 11:48:36 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:07:23.044 11:48:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:07:23.044 [2024-07-15 11:48:36.625640] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:23.044 [2024-07-15 11:48:36.625700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find 
bdev with name: Malloc0 00:07:23.044 00:07:23.303 11:48:36 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:07:23.303 11:48:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:07:23.303 Malloc3 00:07:23.303 11:48:36 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:07:23.303 11:48:36 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:07:23.563 [2024-07-15 11:48:37.115023] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:23.563 [2024-07-15 11:48:37.115064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:23.563 [2024-07-15 11:48:37.115084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27918c0 00:07:23.563 [2024-07-15 11:48:37.115097] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:23.563 [2024-07-15 11:48:37.116666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:23.563 [2024-07-15 11:48:37.116699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:23.563 PTBdevFromMalloc3 00:07:23.563 11:48:37 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:07:23.563 11:48:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:07:23.850 Null0 00:07:23.850 11:48:37 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:07:23.850 11:48:37 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:07:24.109 Malloc0 00:07:24.109 11:48:37 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:07:24.110 11:48:37 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:07:24.368 Malloc1 00:07:24.368 11:48:37 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:07:24.368 11:48:37 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:07:24.628 102400+0 records in 00:07:24.628 102400+0 records out 00:07:24.628 104857600 bytes (105 MB, 100 MiB) copied, 0.308313 s, 340 MB/s 00:07:24.628 11:48:38 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:07:24.628 11:48:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:07:24.887 aio_disk 00:07:24.887 11:48:38 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:07:24.887 11:48:38 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:07:24.887 11:48:38 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:08:03.677 edf83930-b193-4dee-b9d5-631f03541736 
00:08:03.677 11:49:11 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:08:03.677 11:49:11 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:08:03.677 11:49:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:08:03.677 11:49:11 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:08:03.677 11:49:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:08:03.677 11:49:12 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:03.677 11:49:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:08:03.677 11:49:12 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:03.677 11:49:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:03.677 11:49:13 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:08:03.677 MallocForCryptoBdev 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@159 -- # wc -l 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:03.677 11:49:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:08:03.677 [2024-07-15 11:49:13.702092] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:08:03.677 CryptoMallocBdev 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@71 -- # sort 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@72 -- # sort 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:08:03.677 11:49:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:08:03.677 11:49:13 json_config -- 
json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- 
json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.677 11:49:13 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.677 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 
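The repeated `IFS=: read -r ev_type ev_ctx event_id` / `echo` pairs above are get_notifications splitting each `type:ctx:id` record from `jq` on colons and re-emitting `type:ctx` for later sorting. A minimal standalone sketch of that parsing loop (the sample records here are made up, not from this run):

```shell
# Split "type:ctx:id" notification records on ':' the way get_notifications does.
printf '%s\n' 'bdev_register:Malloc0:1' 'bdev_register:Nvme0n1:2' |
while IFS=: read -r ev_type ev_ctx event_id; do
  echo "$ev_type:$ev_ctx"   # re-emit type:ctx, as the harness does
done
```

Because `IFS` is set only for the `read`, the rest of the loop body sees normal word splitting.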
00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\2\8\c\e\e\c\4\d\-\9\4\b\8\-\4\2\b\7\-\a\c\8\f\-\0\9\6\6\c\0\d\b\9\5\6\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\3\8\7\d\e\8\1\5\-\c\4\1\6\-\4\4\d\c\-\a\8\7\7\-\f\3\f\6\4\4\d\1\a\d\a\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\8\f\5\7\a\d\d\a\-\4\1\1\0\-\4\f\6\7\-\9\1\c\a\-\d\4\5\6\1\8\8\4\b\0\1\b\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\d\5\9\3\a\e\9\9\-\0\4\c\7\-\4\e\2\d\-\b\8\5\2\-\9\9\4\2\0\d\9\7\9\6\6\8\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@86 -- # cat 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b bdev_register:aio_disk bdev_register:CryptoMallocBdev bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:08:03.678 Expected events matched: 00:08:03.678 bdev_register:28ceec4d-94b8-42b7-ac8f-0966c0db9563 00:08:03.678 bdev_register:387de815-c416-44dc-a877-f3f644d1ada8 00:08:03.678 
bdev_register:8f57adda-4110-4f67-91ca-d4561884b01b 00:08:03.678 bdev_register:aio_disk 00:08:03.678 bdev_register:CryptoMallocBdev 00:08:03.678 bdev_register:d593ae99-04c7-4e2d-b852-99420d979668 00:08:03.678 bdev_register:Malloc0 00:08:03.678 bdev_register:Malloc0p0 00:08:03.678 bdev_register:Malloc0p1 00:08:03.678 bdev_register:Malloc0p2 00:08:03.678 bdev_register:Malloc1 00:08:03.678 bdev_register:Malloc3 00:08:03.678 bdev_register:MallocForCryptoBdev 00:08:03.678 bdev_register:Null0 00:08:03.678 bdev_register:Nvme0n1 00:08:03.678 bdev_register:Nvme0n1p0 00:08:03.678 bdev_register:Nvme0n1p1 00:08:03.678 bdev_register:PTBdevFromMalloc3 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:03.678 11:49:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:08:03.678 MallocBdevForConfigChangeCheck 00:08:03.678 11:49:14 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:03.678 11:49:14 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:08:03.678 11:49:14 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:08:03.678 INFO: shutting down applications... 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:08:03.678 11:49:14 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:08:03.678 [2024-07-15 11:49:15.082374] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:08:08.955 Calling clear_iscsi_subsystem 00:08:08.955 Calling clear_nvmf_subsystem 00:08:08.955 Calling clear_nbd_subsystem 00:08:08.955 Calling clear_ublk_subsystem 00:08:08.955 Calling clear_vhost_blk_subsystem 00:08:08.955 Calling clear_vhost_scsi_subsystem 00:08:08.955 Calling clear_bdev_subsystem 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@343 -- # count=100 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@345 -- # break 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:08:08.955 11:49:22 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:08:08.955 11:49:22 json_config -- json_config/common.sh@31 -- # local app=target 00:08:08.955 11:49:22 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:08.955 11:49:22 json_config -- json_config/common.sh@35 -- # [[ -n 1411883 ]] 00:08:08.955 11:49:22 json_config -- json_config/common.sh@38 -- # kill -SIGINT 1411883 00:08:08.955 11:49:22 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:08.955 11:49:22 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:08.955 11:49:22 json_config -- json_config/common.sh@41 -- # kill -0 1411883 00:08:08.955 11:49:22 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:08:09.521 11:49:22 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:08:09.521 11:49:22 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:09.521 11:49:22 json_config -- json_config/common.sh@41 -- # kill -0 1411883 00:08:09.521 11:49:22 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:09.521 11:49:22 json_config -- json_config/common.sh@43 -- # break 00:08:09.521 11:49:22 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:09.521 11:49:22 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:09.521 SPDK target 
shutdown done 00:08:09.521 11:49:22 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:08:09.521 INFO: relaunching applications... 00:08:09.521 11:49:22 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:09.521 11:49:22 json_config -- json_config/common.sh@9 -- # local app=target 00:08:09.521 11:49:22 json_config -- json_config/common.sh@10 -- # shift 00:08:09.521 11:49:22 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:09.521 11:49:22 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:09.521 11:49:22 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:08:09.521 11:49:22 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:09.521 11:49:22 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:09.521 11:49:22 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=1419043 00:08:09.521 11:49:22 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:09.521 Waiting for target to run... 00:08:09.521 11:49:22 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:09.521 11:49:22 json_config -- json_config/common.sh@25 -- # waitforlisten 1419043 /var/tmp/spdk_tgt.sock 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@829 -- # '[' -z 1419043 ']' 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
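The shutdown sequence above (json_config/common.sh) signals the target with `kill -SIGINT`, then probes with `kill -0` for up to 30 half-second intervals before declaring it down. A self-contained sketch of that poll-for-exit pattern, with a throwaway `sleep` child standing in for spdk_tgt; SIGTERM is used here instead of the harness's SIGINT because non-interactive shells start background jobs with SIGINT ignored, and `wait` reaps the child so `kill -0` stops seeing a zombie (the real harness polls a foreign PID and needs neither adjustment):

```shell
# Poll-for-exit pattern: signal the app, then probe with `kill -0` until gone.
sleep 30 &                     # stand-in for the spdk_tgt process
pid=$!
kill -TERM "$pid"              # harness uses SIGINT on a non-child PID
wait "$pid" 2>/dev/null || true  # reap so kill -0 reflects real liveness
i=0
while [ "$i" -lt 30 ]; do
  if ! kill -0 "$pid" 2>/dev/null; then
    echo 'SPDK target shutdown done'
    break
  fi
  sleep 0.5
  i=$((i + 1))
done
```

If the loop exhausts its 30 tries, the harness falls through to a hard kill rather than waiting forever.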
00:08:09.521 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:09.521 11:49:22 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:09.521 [2024-07-15 11:49:23.008970] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:08:09.521 [2024-07-15 11:49:23.009051] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1419043 ] 00:08:10.087 [2024-07-15 11:49:23.582114] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.346 [2024-07-15 11:49:23.692553] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.346 [2024-07-15 11:49:23.746738] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:08:10.346 [2024-07-15 11:49:23.754776] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:08:10.346 [2024-07-15 11:49:23.762793] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:08:10.346 [2024-07-15 11:49:23.844124] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:08:12.899 [2024-07-15 11:49:26.051078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:12.899 [2024-07-15 11:49:26.051140] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:12.899 [2024-07-15 11:49:26.051154] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:12.899 [2024-07-15 11:49:26.059094] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
Nvme0n1 00:08:12.899 [2024-07-15 11:49:26.059120] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:08:12.899 [2024-07-15 11:49:26.067106] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:12.899 [2024-07-15 11:49:26.067130] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:08:12.899 [2024-07-15 11:49:26.075144] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:08:12.899 [2024-07-15 11:49:26.075169] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:08:12.899 [2024-07-15 11:49:26.075182] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:15.434 [2024-07-15 11:49:28.958969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:15.434 [2024-07-15 11:49:28.959020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:15.434 [2024-07-15 11:49:28.959040] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19336e0 00:08:15.434 [2024-07-15 11:49:28.959056] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:15.434 [2024-07-15 11:49:28.959348] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:15.434 [2024-07-15 11:49:28.959368] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:08:15.693 11:49:29 json_config -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:15.693 11:49:29 json_config -- common/autotest_common.sh@862 -- # return 0 00:08:15.693 11:49:29 json_config -- json_config/common.sh@26 -- # echo '' 00:08:15.693 00:08:15.693 11:49:29 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:08:15.693 11:49:29 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target 
configuration is the same...' 00:08:15.693 INFO: Checking if target configuration is the same... 00:08:15.693 11:49:29 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:15.693 11:49:29 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:08:15.693 11:49:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:15.693 + '[' 2 -ne 2 ']' 00:08:15.693 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:15.693 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:08:15.693 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:15.693 +++ basename /dev/fd/62 00:08:15.693 ++ mktemp /tmp/62.XXX 00:08:15.693 + tmp_file_1=/tmp/62.cah 00:08:15.693 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:15.693 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:15.693 + tmp_file_2=/tmp/spdk_tgt_config.json.J7Q 00:08:15.693 + ret=0 00:08:15.693 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:16.263 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:16.263 + diff -u /tmp/62.cah /tmp/spdk_tgt_config.json.J7Q 00:08:16.263 + echo 'INFO: JSON config files are the same' 00:08:16.263 INFO: JSON config files are the same 00:08:16.263 + rm /tmp/62.cah /tmp/spdk_tgt_config.json.J7Q 00:08:16.263 + exit 0 00:08:16.263 11:49:29 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:08:16.263 11:49:29 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 
00:08:16.263 INFO: changing configuration and checking if this can be detected... 00:08:16.263 11:49:29 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:16.263 11:49:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:08:16.263 11:49:29 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:16.263 11:49:29 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:08:16.263 11:49:29 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:08:16.263 + '[' 2 -ne 2 ']' 00:08:16.263 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:08:16.263 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:08:16.263 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:16.263 +++ basename /dev/fd/62 00:08:16.263 ++ mktemp /tmp/62.XXX 00:08:16.263 + tmp_file_1=/tmp/62.VOp 00:08:16.263 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:16.263 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:08:16.263 + tmp_file_2=/tmp/spdk_tgt_config.json.4lr 00:08:16.263 + ret=0 00:08:16.263 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:16.832 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:08:16.832 + diff -u /tmp/62.VOp /tmp/spdk_tgt_config.json.4lr 00:08:16.832 + ret=1 00:08:16.832 + echo '=== Start of file: /tmp/62.VOp ===' 00:08:16.832 + cat /tmp/62.VOp 00:08:16.832 + echo '=== End of file: /tmp/62.VOp ===' 00:08:16.832 + echo '' 00:08:16.832 + echo '=== Start of file: /tmp/spdk_tgt_config.json.4lr ===' 00:08:16.832 + cat /tmp/spdk_tgt_config.json.4lr 00:08:16.832 + echo '=== End of file: /tmp/spdk_tgt_config.json.4lr ===' 00:08:16.832 + echo '' 00:08:16.832 + rm /tmp/62.VOp /tmp/spdk_tgt_config.json.4lr 00:08:16.832 + exit 1 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:08:16.832 INFO: configuration change detected. 
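The two json_diff.sh runs above show the harness's normalize-before-diff idiom: save the running config and the reference file, pass both through `config_filter.py -method sort`, then `diff -u` the results; exit 0 means the configs match, exit 1 (after the MallocBdevForConfigChangeCheck deletion) flags a change. The same idiom, sketched with plain line-level `sort` standing in for config_filter.py and made-up file contents:

```shell
# Normalize two configs before diffing so ordering differences don't
# register as changes; `sort` here stands in for config_filter.py -method sort.
tmp1=$(mktemp)
tmp2=$(mktemp)
printf 'bdev_malloc_create\nsave_config\n' | sort > "$tmp1"
printf 'save_config\nbdev_malloc_create\n' | sort > "$tmp2"
if diff -u "$tmp1" "$tmp2" > /dev/null; then
  echo 'INFO: JSON config files are the same'
else
  echo 'INFO: configuration change detected.'
fi
rm -f "$tmp1" "$tmp2"
```

Relying on `diff`'s exit status (0 identical, 1 different) is what lets the harness turn a textual comparison into a pass/fail signal.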
00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:08:16.832 11:49:30 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:16.832 11:49:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@317 -- # [[ -n 1419043 ]] 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:08:16.832 11:49:30 json_config -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:16.832 11:49:30 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:08:16.832 11:49:30 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:08:16.832 11:49:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:08:17.092 11:49:30 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:08:17.092 11:49:30 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:08:17.660 11:49:31 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:08:17.660 11:49:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:08:17.920 11:49:31 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:08:17.920 11:49:31 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@193 -- # uname -s 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:18.179 11:49:31 json_config -- json_config/json_config.sh@323 -- # killprocess 1419043 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@948 -- # '[' -z 1419043 ']' 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@952 -- # kill -0 1419043 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@953 -- # uname 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1419043 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1419043' 00:08:18.179 killing process with pid 1419043 00:08:18.179 11:49:31 json_config -- common/autotest_common.sh@967 -- # kill 1419043 00:08:18.179 11:49:31 json_config -- 
common/autotest_common.sh@972 -- # wait 1419043 00:08:26.304 11:49:38 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:08:26.304 11:49:38 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:08:26.304 11:49:38 json_config -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:26.304 11:49:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:26.304 11:49:38 json_config -- json_config/json_config.sh@328 -- # return 0 00:08:26.304 11:49:38 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:08:26.304 INFO: Success 00:08:26.304 00:08:26.304 real 1m10.329s 00:08:26.304 user 1m15.330s 00:08:26.304 sys 0m4.309s 00:08:26.304 11:49:38 json_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:26.304 11:49:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:08:26.304 ************************************ 00:08:26.304 END TEST json_config 00:08:26.304 ************************************ 00:08:26.304 11:49:38 -- common/autotest_common.sh@1142 -- # return 0 00:08:26.304 11:49:38 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:26.304 11:49:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:26.304 11:49:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:26.304 11:49:38 -- common/autotest_common.sh@10 -- # set +x 00:08:26.304 ************************************ 00:08:26.304 START TEST json_config_extra_key 00:08:26.304 ************************************ 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:08:26.304 11:49:39 json_config_extra_key -- 
json_config/json_config_extra_key.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:08:26.304 11:49:39 
json_config_extra_key -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:08:26.304 11:49:39 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:08:26.304 11:49:39 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:08:26.304 11:49:39 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:26.304 11:49:39 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:26.304 11:49:39 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:26.304 11:49:39 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:08:26.304 11:49:39 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:08:26.304 11:49:39 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:08:26.304 INFO: launching applications... 00:08:26.304 11:49:39 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=1421270 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:08:26.304 Waiting for target to run... 
00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 1421270 /var/tmp/spdk_tgt.sock 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@829 -- # '[' -z 1421270 ']' 00:08:26.304 11:49:39 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:08:26.304 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:26.304 11:49:39 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:26.304 [2024-07-15 11:49:39.206309] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:08:26.304 [2024-07-15 11:49:39.206370] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1421270 ] 00:08:26.304 [2024-07-15 11:49:39.756135] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:26.304 [2024-07-15 11:49:39.864575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:26.575 11:49:40 json_config_extra_key -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:26.575 11:49:40 json_config_extra_key -- common/autotest_common.sh@862 -- # return 0 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:08:26.575 00:08:26.575 11:49:40 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:08:26.575 INFO: shutting down applications... 00:08:26.575 11:49:40 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 1421270 ]] 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 1421270 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:08:26.575 11:49:40 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:08:26.576 11:49:40 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1421270 00:08:26.576 11:49:40 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:08:27.144 11:49:40 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 1421270 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@43 -- # break 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:08:27.144 11:49:40 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:08:27.144 SPDK target shutdown done 00:08:27.144 11:49:40 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:08:27.144 Success 00:08:27.144 00:08:27.144 real 0m1.559s 00:08:27.144 user 0m0.984s 00:08:27.144 sys 0m0.698s 00:08:27.144 11:49:40 json_config_extra_key -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:27.144 11:49:40 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:08:27.144 ************************************ 00:08:27.144 END TEST json_config_extra_key 00:08:27.144 ************************************ 00:08:27.144 11:49:40 -- common/autotest_common.sh@1142 -- # return 0 00:08:27.144 11:49:40 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:27.144 11:49:40 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:27.144 11:49:40 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:27.144 11:49:40 -- common/autotest_common.sh@10 -- # set +x 00:08:27.144 ************************************ 00:08:27.144 START TEST alias_rpc 00:08:27.144 ************************************ 00:08:27.144 11:49:40 alias_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:08:27.403 * Looking for test storage... 
00:08:27.403 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:08:27.403 11:49:40 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:27.403 11:49:40 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=1421500 00:08:27.403 11:49:40 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:27.403 11:49:40 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 1421500 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@829 -- # '[' -z 1421500 ']' 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:27.403 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:27.403 11:49:40 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:27.403 [2024-07-15 11:49:40.861033] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:08:27.403 [2024-07-15 11:49:40.861104] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1421500 ] 00:08:27.403 [2024-07-15 11:49:40.991399] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:27.662 [2024-07-15 11:49:41.089551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:28.229 11:49:41 alias_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:28.229 11:49:41 alias_rpc -- common/autotest_common.sh@862 -- # return 0 00:08:28.229 11:49:41 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:08:28.488 11:49:42 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 1421500 00:08:28.488 11:49:42 alias_rpc -- common/autotest_common.sh@948 -- # '[' -z 1421500 ']' 00:08:28.488 11:49:42 alias_rpc -- common/autotest_common.sh@952 -- # kill -0 1421500 00:08:28.488 11:49:42 alias_rpc -- common/autotest_common.sh@953 -- # uname 00:08:28.488 11:49:42 alias_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:28.488 11:49:42 alias_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1421500 00:08:28.747 11:49:42 alias_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:28.747 11:49:42 alias_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:28.747 11:49:42 alias_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1421500' 00:08:28.747 killing process with pid 1421500 00:08:28.747 11:49:42 alias_rpc -- common/autotest_common.sh@967 -- # kill 1421500 00:08:28.747 11:49:42 alias_rpc -- common/autotest_common.sh@972 -- # wait 1421500 00:08:29.006 00:08:29.006 real 0m1.813s 00:08:29.006 user 0m1.969s 00:08:29.006 sys 0m0.587s 00:08:29.006 11:49:42 alias_rpc -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:08:29.006 11:49:42 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:29.006 ************************************ 00:08:29.006 END TEST alias_rpc 00:08:29.006 ************************************ 00:08:29.006 11:49:42 -- common/autotest_common.sh@1142 -- # return 0 00:08:29.006 11:49:42 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:08:29.006 11:49:42 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:29.006 11:49:42 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:29.006 11:49:42 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:29.006 11:49:42 -- common/autotest_common.sh@10 -- # set +x 00:08:29.006 ************************************ 00:08:29.006 START TEST spdkcli_tcp 00:08:29.006 ************************************ 00:08:29.006 11:49:42 spdkcli_tcp -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:08:29.264 * Looking for test storage... 
00:08:29.264 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:08:29.264 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@722 -- # xtrace_disable 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=1421751 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 1421751 00:08:29.265 11:49:42 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@829 -- # '[' -z 1421751 ']' 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:29.265 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:29.265 11:49:42 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:29.265 [2024-07-15 11:49:42.827675] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:08:29.265 [2024-07-15 11:49:42.827824] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1421751 ] 00:08:29.523 [2024-07-15 11:49:43.023507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:29.781 [2024-07-15 11:49:43.124490] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:29.781 [2024-07-15 11:49:43.124495] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.348 11:49:43 spdkcli_tcp -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:30.348 11:49:43 spdkcli_tcp -- common/autotest_common.sh@862 -- # return 0 00:08:30.348 11:49:43 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:08:30.348 11:49:43 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=1421909 00:08:30.348 11:49:43 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:08:30.606 [ 00:08:30.606 "bdev_malloc_delete", 00:08:30.606 "bdev_malloc_create", 00:08:30.606 "bdev_null_resize", 00:08:30.606 "bdev_null_delete", 00:08:30.606 "bdev_null_create", 00:08:30.606 "bdev_nvme_cuse_unregister", 00:08:30.606 "bdev_nvme_cuse_register", 00:08:30.606 "bdev_opal_new_user", 00:08:30.606 "bdev_opal_set_lock_state", 00:08:30.606 "bdev_opal_delete", 00:08:30.606 "bdev_opal_get_info", 00:08:30.606 "bdev_opal_create", 00:08:30.606 "bdev_nvme_opal_revert", 00:08:30.606 "bdev_nvme_opal_init", 00:08:30.606 "bdev_nvme_send_cmd", 00:08:30.606 
"bdev_nvme_get_path_iostat", 00:08:30.606 "bdev_nvme_get_mdns_discovery_info", 00:08:30.606 "bdev_nvme_stop_mdns_discovery", 00:08:30.606 "bdev_nvme_start_mdns_discovery", 00:08:30.606 "bdev_nvme_set_multipath_policy", 00:08:30.606 "bdev_nvme_set_preferred_path", 00:08:30.606 "bdev_nvme_get_io_paths", 00:08:30.606 "bdev_nvme_remove_error_injection", 00:08:30.606 "bdev_nvme_add_error_injection", 00:08:30.606 "bdev_nvme_get_discovery_info", 00:08:30.606 "bdev_nvme_stop_discovery", 00:08:30.606 "bdev_nvme_start_discovery", 00:08:30.606 "bdev_nvme_get_controller_health_info", 00:08:30.606 "bdev_nvme_disable_controller", 00:08:30.606 "bdev_nvme_enable_controller", 00:08:30.606 "bdev_nvme_reset_controller", 00:08:30.606 "bdev_nvme_get_transport_statistics", 00:08:30.606 "bdev_nvme_apply_firmware", 00:08:30.606 "bdev_nvme_detach_controller", 00:08:30.606 "bdev_nvme_get_controllers", 00:08:30.606 "bdev_nvme_attach_controller", 00:08:30.606 "bdev_nvme_set_hotplug", 00:08:30.606 "bdev_nvme_set_options", 00:08:30.606 "bdev_passthru_delete", 00:08:30.606 "bdev_passthru_create", 00:08:30.606 "bdev_lvol_set_parent_bdev", 00:08:30.606 "bdev_lvol_set_parent", 00:08:30.606 "bdev_lvol_check_shallow_copy", 00:08:30.606 "bdev_lvol_start_shallow_copy", 00:08:30.606 "bdev_lvol_grow_lvstore", 00:08:30.606 "bdev_lvol_get_lvols", 00:08:30.606 "bdev_lvol_get_lvstores", 00:08:30.606 "bdev_lvol_delete", 00:08:30.606 "bdev_lvol_set_read_only", 00:08:30.606 "bdev_lvol_resize", 00:08:30.606 "bdev_lvol_decouple_parent", 00:08:30.606 "bdev_lvol_inflate", 00:08:30.606 "bdev_lvol_rename", 00:08:30.606 "bdev_lvol_clone_bdev", 00:08:30.606 "bdev_lvol_clone", 00:08:30.606 "bdev_lvol_snapshot", 00:08:30.606 "bdev_lvol_create", 00:08:30.606 "bdev_lvol_delete_lvstore", 00:08:30.606 "bdev_lvol_rename_lvstore", 00:08:30.606 "bdev_lvol_create_lvstore", 00:08:30.606 "bdev_raid_set_options", 00:08:30.606 "bdev_raid_remove_base_bdev", 00:08:30.606 "bdev_raid_add_base_bdev", 00:08:30.606 "bdev_raid_delete", 
00:08:30.606 "bdev_raid_create", 00:08:30.606 "bdev_raid_get_bdevs", 00:08:30.606 "bdev_error_inject_error", 00:08:30.606 "bdev_error_delete", 00:08:30.606 "bdev_error_create", 00:08:30.606 "bdev_split_delete", 00:08:30.606 "bdev_split_create", 00:08:30.606 "bdev_delay_delete", 00:08:30.606 "bdev_delay_create", 00:08:30.606 "bdev_delay_update_latency", 00:08:30.606 "bdev_zone_block_delete", 00:08:30.606 "bdev_zone_block_create", 00:08:30.606 "blobfs_create", 00:08:30.606 "blobfs_detect", 00:08:30.606 "blobfs_set_cache_size", 00:08:30.606 "bdev_crypto_delete", 00:08:30.606 "bdev_crypto_create", 00:08:30.606 "bdev_compress_delete", 00:08:30.606 "bdev_compress_create", 00:08:30.606 "bdev_compress_get_orphans", 00:08:30.606 "bdev_aio_delete", 00:08:30.606 "bdev_aio_rescan", 00:08:30.606 "bdev_aio_create", 00:08:30.606 "bdev_ftl_set_property", 00:08:30.606 "bdev_ftl_get_properties", 00:08:30.606 "bdev_ftl_get_stats", 00:08:30.606 "bdev_ftl_unmap", 00:08:30.606 "bdev_ftl_unload", 00:08:30.606 "bdev_ftl_delete", 00:08:30.606 "bdev_ftl_load", 00:08:30.606 "bdev_ftl_create", 00:08:30.606 "bdev_virtio_attach_controller", 00:08:30.606 "bdev_virtio_scsi_get_devices", 00:08:30.606 "bdev_virtio_detach_controller", 00:08:30.606 "bdev_virtio_blk_set_hotplug", 00:08:30.606 "bdev_iscsi_delete", 00:08:30.606 "bdev_iscsi_create", 00:08:30.606 "bdev_iscsi_set_options", 00:08:30.606 "accel_error_inject_error", 00:08:30.606 "ioat_scan_accel_module", 00:08:30.606 "dsa_scan_accel_module", 00:08:30.606 "iaa_scan_accel_module", 00:08:30.606 "dpdk_cryptodev_get_driver", 00:08:30.606 "dpdk_cryptodev_set_driver", 00:08:30.606 "dpdk_cryptodev_scan_accel_module", 00:08:30.606 "compressdev_scan_accel_module", 00:08:30.606 "keyring_file_remove_key", 00:08:30.606 "keyring_file_add_key", 00:08:30.606 "keyring_linux_set_options", 00:08:30.606 "iscsi_get_histogram", 00:08:30.606 "iscsi_enable_histogram", 00:08:30.606 "iscsi_set_options", 00:08:30.606 "iscsi_get_auth_groups", 00:08:30.606 
"iscsi_auth_group_remove_secret", 00:08:30.606 "iscsi_auth_group_add_secret", 00:08:30.606 "iscsi_delete_auth_group", 00:08:30.606 "iscsi_create_auth_group", 00:08:30.606 "iscsi_set_discovery_auth", 00:08:30.606 "iscsi_get_options", 00:08:30.606 "iscsi_target_node_request_logout", 00:08:30.606 "iscsi_target_node_set_redirect", 00:08:30.606 "iscsi_target_node_set_auth", 00:08:30.606 "iscsi_target_node_add_lun", 00:08:30.606 "iscsi_get_stats", 00:08:30.606 "iscsi_get_connections", 00:08:30.606 "iscsi_portal_group_set_auth", 00:08:30.606 "iscsi_start_portal_group", 00:08:30.606 "iscsi_delete_portal_group", 00:08:30.606 "iscsi_create_portal_group", 00:08:30.606 "iscsi_get_portal_groups", 00:08:30.606 "iscsi_delete_target_node", 00:08:30.606 "iscsi_target_node_remove_pg_ig_maps", 00:08:30.606 "iscsi_target_node_add_pg_ig_maps", 00:08:30.606 "iscsi_create_target_node", 00:08:30.606 "iscsi_get_target_nodes", 00:08:30.606 "iscsi_delete_initiator_group", 00:08:30.606 "iscsi_initiator_group_remove_initiators", 00:08:30.606 "iscsi_initiator_group_add_initiators", 00:08:30.606 "iscsi_create_initiator_group", 00:08:30.606 "iscsi_get_initiator_groups", 00:08:30.606 "nvmf_set_crdt", 00:08:30.606 "nvmf_set_config", 00:08:30.606 "nvmf_set_max_subsystems", 00:08:30.606 "nvmf_stop_mdns_prr", 00:08:30.606 "nvmf_publish_mdns_prr", 00:08:30.606 "nvmf_subsystem_get_listeners", 00:08:30.606 "nvmf_subsystem_get_qpairs", 00:08:30.606 "nvmf_subsystem_get_controllers", 00:08:30.606 "nvmf_get_stats", 00:08:30.606 "nvmf_get_transports", 00:08:30.606 "nvmf_create_transport", 00:08:30.606 "nvmf_get_targets", 00:08:30.606 "nvmf_delete_target", 00:08:30.606 "nvmf_create_target", 00:08:30.606 "nvmf_subsystem_allow_any_host", 00:08:30.606 "nvmf_subsystem_remove_host", 00:08:30.606 "nvmf_subsystem_add_host", 00:08:30.606 "nvmf_ns_remove_host", 00:08:30.606 "nvmf_ns_add_host", 00:08:30.606 "nvmf_subsystem_remove_ns", 00:08:30.606 "nvmf_subsystem_add_ns", 00:08:30.606 
"nvmf_subsystem_listener_set_ana_state", 00:08:30.607 "nvmf_discovery_get_referrals", 00:08:30.607 "nvmf_discovery_remove_referral", 00:08:30.607 "nvmf_discovery_add_referral", 00:08:30.607 "nvmf_subsystem_remove_listener", 00:08:30.607 "nvmf_subsystem_add_listener", 00:08:30.607 "nvmf_delete_subsystem", 00:08:30.607 "nvmf_create_subsystem", 00:08:30.607 "nvmf_get_subsystems", 00:08:30.607 "env_dpdk_get_mem_stats", 00:08:30.607 "nbd_get_disks", 00:08:30.607 "nbd_stop_disk", 00:08:30.607 "nbd_start_disk", 00:08:30.607 "ublk_recover_disk", 00:08:30.607 "ublk_get_disks", 00:08:30.607 "ublk_stop_disk", 00:08:30.607 "ublk_start_disk", 00:08:30.607 "ublk_destroy_target", 00:08:30.607 "ublk_create_target", 00:08:30.607 "virtio_blk_create_transport", 00:08:30.607 "virtio_blk_get_transports", 00:08:30.607 "vhost_controller_set_coalescing", 00:08:30.607 "vhost_get_controllers", 00:08:30.607 "vhost_delete_controller", 00:08:30.607 "vhost_create_blk_controller", 00:08:30.607 "vhost_scsi_controller_remove_target", 00:08:30.607 "vhost_scsi_controller_add_target", 00:08:30.607 "vhost_start_scsi_controller", 00:08:30.607 "vhost_create_scsi_controller", 00:08:30.607 "thread_set_cpumask", 00:08:30.607 "framework_get_governor", 00:08:30.607 "framework_get_scheduler", 00:08:30.607 "framework_set_scheduler", 00:08:30.607 "framework_get_reactors", 00:08:30.607 "thread_get_io_channels", 00:08:30.607 "thread_get_pollers", 00:08:30.607 "thread_get_stats", 00:08:30.607 "framework_monitor_context_switch", 00:08:30.607 "spdk_kill_instance", 00:08:30.607 "log_enable_timestamps", 00:08:30.607 "log_get_flags", 00:08:30.607 "log_clear_flag", 00:08:30.607 "log_set_flag", 00:08:30.607 "log_get_level", 00:08:30.607 "log_set_level", 00:08:30.607 "log_get_print_level", 00:08:30.607 "log_set_print_level", 00:08:30.607 "framework_enable_cpumask_locks", 00:08:30.607 "framework_disable_cpumask_locks", 00:08:30.607 "framework_wait_init", 00:08:30.607 "framework_start_init", 00:08:30.607 "scsi_get_devices", 
00:08:30.607 "bdev_get_histogram", 00:08:30.607 "bdev_enable_histogram", 00:08:30.607 "bdev_set_qos_limit", 00:08:30.607 "bdev_set_qd_sampling_period", 00:08:30.607 "bdev_get_bdevs", 00:08:30.607 "bdev_reset_iostat", 00:08:30.607 "bdev_get_iostat", 00:08:30.607 "bdev_examine", 00:08:30.607 "bdev_wait_for_examine", 00:08:30.607 "bdev_set_options", 00:08:30.607 "notify_get_notifications", 00:08:30.607 "notify_get_types", 00:08:30.607 "accel_get_stats", 00:08:30.607 "accel_set_options", 00:08:30.607 "accel_set_driver", 00:08:30.607 "accel_crypto_key_destroy", 00:08:30.607 "accel_crypto_keys_get", 00:08:30.607 "accel_crypto_key_create", 00:08:30.607 "accel_assign_opc", 00:08:30.607 "accel_get_module_info", 00:08:30.607 "accel_get_opc_assignments", 00:08:30.607 "vmd_rescan", 00:08:30.607 "vmd_remove_device", 00:08:30.607 "vmd_enable", 00:08:30.607 "sock_get_default_impl", 00:08:30.607 "sock_set_default_impl", 00:08:30.607 "sock_impl_set_options", 00:08:30.607 "sock_impl_get_options", 00:08:30.607 "iobuf_get_stats", 00:08:30.607 "iobuf_set_options", 00:08:30.607 "framework_get_pci_devices", 00:08:30.607 "framework_get_config", 00:08:30.607 "framework_get_subsystems", 00:08:30.607 "trace_get_info", 00:08:30.607 "trace_get_tpoint_group_mask", 00:08:30.607 "trace_disable_tpoint_group", 00:08:30.607 "trace_enable_tpoint_group", 00:08:30.607 "trace_clear_tpoint_mask", 00:08:30.607 "trace_set_tpoint_mask", 00:08:30.607 "keyring_get_keys", 00:08:30.607 "spdk_get_version", 00:08:30.607 "rpc_get_methods" 00:08:30.607 ] 00:08:30.865 11:49:44 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@728 -- # xtrace_disable 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:30.865 11:49:44 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:08:30.865 11:49:44 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 1421751 00:08:30.865 11:49:44 spdkcli_tcp -- 
common/autotest_common.sh@948 -- # '[' -z 1421751 ']' 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@952 -- # kill -0 1421751 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@953 -- # uname 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1421751 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1421751' 00:08:30.865 killing process with pid 1421751 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@967 -- # kill 1421751 00:08:30.865 11:49:44 spdkcli_tcp -- common/autotest_common.sh@972 -- # wait 1421751 00:08:31.123 00:08:31.123 real 0m2.114s 00:08:31.123 user 0m3.942s 00:08:31.123 sys 0m0.704s 00:08:31.123 11:49:44 spdkcli_tcp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:31.123 11:49:44 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:08:31.123 ************************************ 00:08:31.123 END TEST spdkcli_tcp 00:08:31.123 ************************************ 00:08:31.382 11:49:44 -- common/autotest_common.sh@1142 -- # return 0 00:08:31.382 11:49:44 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:31.382 11:49:44 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:31.382 11:49:44 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:31.382 11:49:44 -- common/autotest_common.sh@10 -- # set +x 00:08:31.382 ************************************ 00:08:31.382 START TEST dpdk_mem_utility 00:08:31.382 ************************************ 00:08:31.382 11:49:44 dpdk_mem_utility -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:08:31.382 * Looking for test storage... 00:08:31.382 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:08:31.382 11:49:44 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:31.382 11:49:44 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=1422154 00:08:31.382 11:49:44 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 1422154 00:08:31.382 11:49:44 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@829 -- # '[' -z 1422154 ']' 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:31.382 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:31.382 11:49:44 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:31.382 [2024-07-15 11:49:44.954052] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:08:31.382 [2024-07-15 11:49:44.954115] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1422154 ] 00:08:31.640 [2024-07-15 11:49:45.065458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.640 [2024-07-15 11:49:45.164009] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.576 11:49:45 dpdk_mem_utility -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:32.576 11:49:45 dpdk_mem_utility -- common/autotest_common.sh@862 -- # return 0 00:08:32.576 11:49:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:08:32.576 11:49:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:08:32.576 11:49:45 dpdk_mem_utility -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:32.576 11:49:45 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:08:32.576 { 00:08:32.576 "filename": "/tmp/spdk_mem_dump.txt" 00:08:32.576 } 00:08:32.576 11:49:45 dpdk_mem_utility -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:32.576 11:49:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:08:32.576 DPDK memory size 816.000000 MiB in 2 heap(s) 00:08:32.576 2 heaps totaling size 816.000000 MiB 00:08:32.576 size: 814.000000 MiB heap id: 0 00:08:32.576 size: 2.000000 MiB heap id: 1 00:08:32.576 end heaps---------- 00:08:32.576 8 mempools totaling size 598.116089 MiB 00:08:32.576 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:08:32.576 size: 158.602051 MiB name: PDU_data_out_Pool 00:08:32.576 size: 84.521057 MiB name: bdev_io_1422154 00:08:32.576 size: 51.011292 MiB name: evtpool_1422154 00:08:32.576 size: 
50.003479 MiB name: msgpool_1422154 00:08:32.576 size: 21.763794 MiB name: PDU_Pool 00:08:32.576 size: 19.513306 MiB name: SCSI_TASK_Pool 00:08:32.576 size: 0.026123 MiB name: Session_Pool 00:08:32.576 end mempools------- 00:08:32.576 201 memzones totaling size 4.176453 MiB 00:08:32.576 size: 1.000366 MiB name: RG_ring_0_1422154 00:08:32.576 size: 1.000366 MiB name: RG_ring_1_1422154 00:08:32.576 size: 1.000366 MiB name: RG_ring_4_1422154 00:08:32.576 size: 1.000366 MiB name: RG_ring_5_1422154 00:08:32.576 size: 0.125366 MiB name: RG_ring_2_1422154 00:08:32.576 size: 0.015991 MiB name: RG_ring_3_1422154 00:08:32.576 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.1_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.2_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.3_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.4_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.5_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.6_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:01.7_qat 
00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.0_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.1_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.2_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.3_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.4_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.5_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.6_qat 00:08:32.576 size: 0.000305 MiB name: 0000:3f:02.7_qat 00:08:32.576 size: 0.000305 MiB name: 0000:da:01.0_qat 00:08:32.576 size: 0.000305 MiB name: 0000:da:01.1_qat 00:08:32.576 size: 0.000305 MiB name: 0000:da:01.2_qat 00:08:32.576 size: 0.000305 MiB name: 0000:da:01.3_qat 00:08:32.576 size: 0.000305 MiB name: 0000:da:01.4_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:01.5_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:01.6_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:01.7_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.0_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.1_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.2_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.3_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.4_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.5_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.6_qat 00:08:32.577 size: 0.000305 MiB name: 0000:da:02.7_qat 00:08:32.577 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_0 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_1 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_0 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_2 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_3 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_1 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_4 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_5 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_2 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_6 00:08:32.577 size: 
0.000122 MiB name: rte_cryptodev_data_7 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_3 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_8 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_9 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_4 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_10 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_11 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_5 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_12 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_13 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_6 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_14 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_15 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_7 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_16 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_17 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_8 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_18 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_19 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_9 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_20 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_21 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_10 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_22 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_23 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_11 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_24 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_25 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_12 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_26 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_27 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_13 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_28 00:08:32.577 size: 
0.000122 MiB name: rte_cryptodev_data_29 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_14 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_30 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_31 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_15 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_32 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_33 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_16 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_34 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_35 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_17 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_36 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_37 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_18 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_38 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_39 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_19 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_40 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_41 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_20 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_42 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_43 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_21 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_44 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_45 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_22 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_46 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_47 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_23 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_48 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_49 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_24 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_50 00:08:32.577 
size: 0.000122 MiB name: rte_cryptodev_data_51 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_25 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_52 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_53 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_26 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_54 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_55 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_27 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_56 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_57 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_28 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_58 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_59 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_29 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_60 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_61 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_30 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_62 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_63 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_31 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_64 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_65 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_32 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_66 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_67 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_33 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_68 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_69 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_34 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_70 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_71 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_35 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_72 
00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_73 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_36 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_74 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_75 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_37 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_76 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_77 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_38 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_78 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_79 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_39 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_80 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_81 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_40 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_82 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_83 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_41 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_84 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_85 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_42 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_86 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_87 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_43 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_88 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_89 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_44 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_90 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_91 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_45 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_92 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_93 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_46 00:08:32.577 size: 0.000122 MiB name: 
rte_cryptodev_data_94 00:08:32.577 size: 0.000122 MiB name: rte_cryptodev_data_95 00:08:32.577 size: 0.000122 MiB name: rte_compressdev_data_47 00:08:32.577 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:08:32.577 end memzones------- 00:08:32.577 11:49:45 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:08:32.577 heap id: 0 total size: 814.000000 MiB number of busy elements: 594 number of free elements: 14 00:08:32.577 list of free elements. size: 11.801086 MiB 00:08:32.577 element at address: 0x200000400000 with size: 1.999512 MiB 00:08:32.577 element at address: 0x200018e00000 with size: 0.999878 MiB 00:08:32.577 element at address: 0x200019000000 with size: 0.999878 MiB 00:08:32.577 element at address: 0x200003e00000 with size: 0.996460 MiB 00:08:32.577 element at address: 0x200031c00000 with size: 0.994446 MiB 00:08:32.577 element at address: 0x200013800000 with size: 0.978882 MiB 00:08:32.577 element at address: 0x200007000000 with size: 0.959839 MiB 00:08:32.577 element at address: 0x200019200000 with size: 0.937256 MiB 00:08:32.577 element at address: 0x20001aa00000 with size: 0.577393 MiB 00:08:32.577 element at address: 0x200003a00000 with size: 0.498535 MiB 00:08:32.577 element at address: 0x20000b200000 with size: 0.491272 MiB 00:08:32.577 element at address: 0x200000800000 with size: 0.486145 MiB 00:08:32.577 element at address: 0x200019400000 with size: 0.485840 MiB 00:08:32.577 element at address: 0x200027e00000 with size: 0.395752 MiB 00:08:32.577 list of standard malloc elements. 
size: 199.890625 MiB 00:08:32.577 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:08:32.577 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:08:32.577 element at address: 0x200018efff80 with size: 1.000122 MiB 00:08:32.577 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:08:32.577 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:08:32.577 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:08:32.578 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:08:32.578 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:08:32.578 element at address: 0x200000330b40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003340c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000337640 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000033abc0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000033e140 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003416c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000344c40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003481c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000034b740 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000034ecc0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000352240 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003557c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000358d40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000035c2c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000035f840 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000366880 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000036a340 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000036de00 with size: 0.004395 MiB 00:08:32.578 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000375380 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000378e40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000037c900 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000383e80 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000387940 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000038b400 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000392980 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000396440 with size: 0.004395 MiB 00:08:32.578 element at address: 0x200000399f00 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:08:32.578 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:08:32.578 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:08:32.578 element at address: 0x20000032ea40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000032fac0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000331fc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000333040 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000335540 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003365c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000338ac0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000339b40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000033c040 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000033d0c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000033f5c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000340640 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000342b40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000343bc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003460c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000347140 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000349640 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000034a6c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000034cbc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000034dc40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000350140 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003511c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003536c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000354740 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000356c40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000357cc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000035a1c0 with 
size: 0.004028 MiB 00:08:32.578 element at address: 0x20000035b240 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000035d740 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000035e7c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000361d40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000364780 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000365800 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000368240 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000370840 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000373280 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000374300 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000376d40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000037a800 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000037b880 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000037f340 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000381d80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000382e00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000385840 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000389300 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000038a380 with size: 0.004028 MiB 00:08:32.578 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000038de40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000390880 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000391900 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000394340 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000397e00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x200000398e80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000039c940 with size: 0.004028 MiB 00:08:32.578 element at address: 0x20000039f380 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:08:32.578 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:08:32.578 element at address: 0x2000002042c0 with size: 0.000305 MiB 00:08:32.578 element at address: 0x200000200000 with size: 0.000183 MiB 00:08:32.578 element at address: 0x2000002000c0 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200180 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200240 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200300 with size: 0.000183 MiB 00:08:32.578 element at address: 0x2000002003c0 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200480 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200540 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200600 with size: 0.000183 MiB 00:08:32.578 element at address: 0x2000002006c0 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200780 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200840 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200900 with size: 0.000183 
MiB 00:08:32.578 element at address: 0x2000002009c0 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200a80 with size: 0.000183 MiB 00:08:32.578 element at address: 0x200000200b40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200c00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200cc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200d80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200e40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200f00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000200fc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201080 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201140 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201200 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002012c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201380 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201440 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201500 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002015c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201680 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201740 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201800 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002018c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201980 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201a40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201b00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201bc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201c80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201d40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201e00 
with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201ec0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000201f80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202040 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202100 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002021c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202280 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202340 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202400 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002024c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202580 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202640 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202700 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002027c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202880 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202940 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202a00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202ac0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202b80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202c40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202d00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202dc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202e80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000202f40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203000 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002030c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203180 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203240 with size: 0.000183 MiB 00:08:32.579 element at 
address: 0x200000203300 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002033c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203480 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203540 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203600 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002036c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203780 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203840 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203900 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002039c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203a80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203b40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203c00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203cc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203d80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203e40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203f00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000203fc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204080 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204140 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204200 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204400 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002044c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204580 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204640 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204700 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002047c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204880 with size: 0.000183 MiB 
00:08:32.579 element at address: 0x200000204940 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204a00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204ac0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204b80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204c40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204d00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204dc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204e80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000204f40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205000 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002050c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205180 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205240 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205300 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002053c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205480 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205540 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205600 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002056c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205780 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205840 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205900 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002059c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205a80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205b40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205c00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205cc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205d80 with 
size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205e40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205f00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000205fc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000206080 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000206140 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000206200 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002062c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000002064c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000020a780 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022aa40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022ab00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022abc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022ac80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022ad40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022ae00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022aec0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022af80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b040 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b100 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b1c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b280 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b340 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b400 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b4c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b580 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b640 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b700 with size: 0.000183 MiB 00:08:32.579 element at address: 
0x20000022b900 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022b9c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022ba80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bb40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bc00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bcc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bd80 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022be40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bf00 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022bfc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c080 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c140 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c200 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c2c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c380 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c440 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000022c500 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000032e700 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000032e7c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000331d40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000003352c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000338840 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000033bdc0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000033f340 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000003428c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x200000345e40 with size: 0.000183 MiB 00:08:32.579 element at address: 0x2000003493c0 with size: 0.000183 MiB 00:08:32.579 element at address: 0x20000034c940 with size: 0.000183 MiB 00:08:32.579 
element at address: 0x20000034fec0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000353440 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003569c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000359f40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000035d4c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000360a40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000363fc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000364180 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000364240 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000364400 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000367a80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000367c40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000367d00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000367ec0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036b540 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036b700 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036b7c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036b980 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036f000 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036f1c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036f280 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000036f440 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000372ac0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000372c80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000372d40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000372f00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000376580 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000376740 with size: 0.000183 
MiB 00:08:32.580 element at address: 0x200000376800 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003769c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037a040 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037a200 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037a2c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037a480 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037db00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037dcc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037dd80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000037df40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003815c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000381780 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000381840 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000381a00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000385080 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000385240 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000385300 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003854c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000388b40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000388d00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000388dc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000388f80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000038c600 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000038c7c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000038c880 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000038ca40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003900c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000390280 
with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000390340 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000390500 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000393b80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000393d40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000393e00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000393fc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000397640 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000397800 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003978c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x200000397a80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039b100 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039b2c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039b380 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039b540 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039ebc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039ed80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039ee40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000039f000 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a2680 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a2840 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a2900 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a2ac0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a6140 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a6300 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a63c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a6580 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a9c00 with size: 0.000183 MiB 00:08:32.580 element at 
address: 0x2000003a9dc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003a9e80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003aa040 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003ad6c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003ad880 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003ad940 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003adb00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b1180 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b1340 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b1400 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b4c40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b4e00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b4ec0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b5080 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b8700 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b88c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b8980 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003b8b40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bc1c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bc380 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bc440 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bc600 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bfc80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bfe40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003bff00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c00c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c3740 with size: 0.000183 MiB 
00:08:32.580 element at address: 0x2000003c3900 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c39c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c3b80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c7200 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c73c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c7480 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003c7640 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003cacc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003cae80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003caf40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003cb100 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003ce780 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003ce940 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003cea00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003cebc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d2240 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d2400 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d24c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d2680 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d5dc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d64c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d6580 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000003d6880 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087c740 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087c800 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087c980 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087ca40 with 
size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:08:32.580 element at address: 0x2000070fdd80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa93d00 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa93dc0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa93e80 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa93f40 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94000 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa940c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94180 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94240 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94300 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa943c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94480 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94540 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94600 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa946c0 with size: 0.000183 MiB 00:08:32.580 element at address: 0x20001aa94780 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94840 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94900 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa949c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94a80 with size: 0.000183 MiB 00:08:32.581 element at address: 
0x20001aa94b40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94c00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94cc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94d80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94e40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94f00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa94fc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa95080 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa95140 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa95200 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa952c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa95380 with size: 0.000183 MiB 00:08:32.581 element at address: 0x20001aa95440 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e65500 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e655c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c1c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c3c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c480 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c540 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c600 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c6c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c780 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c840 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c900 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6c9c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ca80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6cb40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6cc00 with size: 0.000183 MiB 00:08:32.581 
element at address: 0x200027e6ccc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6cd80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ce40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6cf00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6cfc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d080 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d140 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d200 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d2c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d380 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6d980 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e100 with size: 0.000183 
MiB 00:08:32.581 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ee80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f600 
with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:08:32.581 element at address: 0x200027e6ff00 with size: 0.000183 MiB
00:08:32.581 list of memzone associated elements. size: 602.308289 MiB 00:08:32.581 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:08:32.581 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:08:32.581 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:08:32.581 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:08:32.581 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:08:32.581 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_1422154_0 00:08:32.581 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:08:32.581 associated memzone info: size: 48.002930 MiB name: MP_evtpool_1422154_0 00:08:32.581 element at address: 0x200003fff380 with size: 48.003052 MiB 00:08:32.581 associated memzone info: size: 48.002930 MiB name: MP_msgpool_1422154_0 00:08:32.581 element at address: 0x2000195be940 with size: 20.255554 MiB 00:08:32.581 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:08:32.581 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:08:32.581 
associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:08:32.581 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:08:32.581 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_1422154 00:08:32.581 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:08:32.581 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_1422154 00:08:32.581 element at address: 0x20000022c5c0 with size: 1.008118 MiB 00:08:32.581 associated memzone info: size: 1.007996 MiB name: MP_evtpool_1422154 00:08:32.581 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:08:32.581 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:08:32.581 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:08:32.581 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:08:32.581 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:08:32.581 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:08:32.581 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:08:32.581 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:08:32.581 element at address: 0x200003eff180 with size: 1.000488 MiB 00:08:32.581 associated memzone info: size: 1.000366 MiB name: RG_ring_0_1422154 00:08:32.581 element at address: 0x200003affc00 with size: 1.000488 MiB 00:08:32.581 associated memzone info: size: 1.000366 MiB name: RG_ring_1_1422154 00:08:32.581 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:08:32.582 associated memzone info: size: 1.000366 MiB name: RG_ring_4_1422154 00:08:32.582 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:08:32.582 associated memzone info: size: 1.000366 MiB name: RG_ring_5_1422154 00:08:32.582 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:08:32.582 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_1422154 00:08:32.582 element at address: 0x20000b27dc40 with size: 0.500488 MiB 
00:08:32.582 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:08:32.582 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:08:32.582 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:08:32.582 element at address: 0x20001947c600 with size: 0.250488 MiB 00:08:32.582 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:08:32.582 element at address: 0x20000020a840 with size: 0.125488 MiB 00:08:32.582 associated memzone info: size: 0.125366 MiB name: RG_ring_2_1422154 00:08:32.582 element at address: 0x2000070f5b80 with size: 0.031738 MiB 00:08:32.582 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:08:32.582 element at address: 0x200027e65680 with size: 0.023743 MiB 00:08:32.582 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:08:32.582 element at address: 0x200000206580 with size: 0.016113 MiB 00:08:32.582 associated memzone info: size: 0.015991 MiB name: RG_ring_3_1422154 00:08:32.582 element at address: 0x200027e6b7c0 with size: 0.002441 MiB 00:08:32.582 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:08:32.582 element at address: 0x2000003d5f80 with size: 0.001282 MiB 00:08:32.582 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:08:32.582 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.0_qat 00:08:32.582 element at address: 0x2000003d2840 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.1_qat 00:08:32.582 element at address: 0x2000003ced80 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.2_qat 00:08:32.582 element at address: 0x2000003cb2c0 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.3_qat 00:08:32.582 element at address: 0x2000003c7800 with size: 
0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.4_qat 00:08:32.582 element at address: 0x2000003c3d40 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.5_qat 00:08:32.582 element at address: 0x2000003c0280 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.6_qat 00:08:32.582 element at address: 0x2000003bc7c0 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:01.7_qat 00:08:32.582 element at address: 0x2000003b8d00 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.0_qat 00:08:32.582 element at address: 0x2000003b5240 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.1_qat 00:08:32.582 element at address: 0x2000003b1780 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.2_qat 00:08:32.582 element at address: 0x2000003adcc0 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.3_qat 00:08:32.582 element at address: 0x2000003aa200 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.4_qat 00:08:32.582 element at address: 0x2000003a6740 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.5_qat 00:08:32.582 element at address: 0x2000003a2c80 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.6_qat 00:08:32.582 element at address: 0x20000039f1c0 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3d:02.7_qat 00:08:32.582 element at address: 0x20000039b700 with size: 0.000427 MiB 00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.0_qat 00:08:32.582 element at address: 0x200000397c40 with size: 0.000427 MiB 
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.1_qat
00:08:32.582 element at address: 0x200000394180 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.2_qat
00:08:32.582 element at address: 0x2000003906c0 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.3_qat
00:08:32.582 element at address: 0x20000038cc00 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.4_qat
00:08:32.582 element at address: 0x200000389140 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.5_qat
00:08:32.582 element at address: 0x200000385680 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.6_qat
00:08:32.582 element at address: 0x200000381bc0 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:01.7_qat
00:08:32.582 element at address: 0x20000037e100 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.0_qat
00:08:32.582 element at address: 0x20000037a640 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.1_qat
00:08:32.582 element at address: 0x200000376b80 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.2_qat
00:08:32.582 element at address: 0x2000003730c0 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.3_qat
00:08:32.582 element at address: 0x20000036f600 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.4_qat
00:08:32.582 element at address: 0x20000036bb40 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.5_qat
00:08:32.582 element at address: 0x200000368080 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.6_qat
00:08:32.582 element at address: 0x2000003645c0 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:3f:02.7_qat
00:08:32.582 element at address: 0x200000360b00 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.0_qat
00:08:32.582 element at address: 0x20000035d580 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.1_qat
00:08:32.582 element at address: 0x20000035a000 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.2_qat
00:08:32.582 element at address: 0x200000356a80 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.3_qat
00:08:32.582 element at address: 0x200000353500 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.4_qat
00:08:32.582 element at address: 0x20000034ff80 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.5_qat
00:08:32.582 element at address: 0x20000034ca00 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.6_qat
00:08:32.582 element at address: 0x200000349480 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:01.7_qat
00:08:32.582 element at address: 0x200000345f00 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.0_qat
00:08:32.582 element at address: 0x200000342980 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.1_qat
00:08:32.582 element at address: 0x20000033f400 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.2_qat
00:08:32.582 element at address: 0x20000033be80 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.3_qat
00:08:32.582 element at address: 0x200000338900 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.4_qat
00:08:32.582 element at address: 0x200000335380 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.5_qat
00:08:32.582 element at address: 0x200000331e00 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.6_qat
00:08:32.582 element at address: 0x20000032e880 with size: 0.000427 MiB
00:08:32.582 associated memzone info: size: 0.000305 MiB name: 0000:da:02.7_qat
00:08:32.582 element at address: 0x2000003d6740 with size: 0.000305 MiB
00:08:32.582 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1
00:08:32.582 element at address: 0x20000022b7c0 with size: 0.000305 MiB
00:08:32.582 associated memzone info: size: 0.000183 MiB name: MP_msgpool_1422154
00:08:32.582 element at address: 0x200000206380 with size: 0.000305 MiB
00:08:32.582 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_1422154
00:08:32.582 element at address: 0x200027e6c280 with size: 0.000305 MiB
00:08:32.582 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool
00:08:32.582 element at address: 0x2000003d6940 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0
00:08:32.582 element at address: 0x2000003d6640 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1
00:08:32.582 element at address: 0x2000003d5e80 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0
00:08:32.582 element at address: 0x2000003d2740 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2
00:08:32.582 element at address: 0x2000003d2580 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3
00:08:32.582 element at address: 0x2000003d2300 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1
00:08:32.582 element at address: 0x2000003cec80 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4
00:08:32.582 element at address: 0x2000003ceac0 with size: 0.000244 MiB
00:08:32.582 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5
00:08:32.582 element at address: 0x2000003ce840 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2
00:08:32.583 element at address: 0x2000003cb1c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6
00:08:32.583 element at address: 0x2000003cb000 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7
00:08:32.583 element at address: 0x2000003cad80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3
00:08:32.583 element at address: 0x2000003c7700 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8
00:08:32.583 element at address: 0x2000003c7540 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9
00:08:32.583 element at address: 0x2000003c72c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4
00:08:32.583 element at address: 0x2000003c3c40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10
00:08:32.583 element at address: 0x2000003c3a80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11
00:08:32.583 element at address: 0x2000003c3800 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5
00:08:32.583 element at address: 0x2000003c0180 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12
00:08:32.583 element at address: 0x2000003bffc0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_13
00:08:32.583 element at address: 0x2000003bfd40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6
00:08:32.583 element at address: 0x2000003bc6c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14
00:08:32.583 element at address: 0x2000003bc500 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15
00:08:32.583 element at address: 0x2000003bc280 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7
00:08:32.583 element at address: 0x2000003b8c00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16
00:08:32.583 element at address: 0x2000003b8a40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17
00:08:32.583 element at address: 0x2000003b87c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8
00:08:32.583 element at address: 0x2000003b5140 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18
00:08:32.583 element at address: 0x2000003b4f80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19
00:08:32.583 element at address: 0x2000003b4d00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9
00:08:32.583 element at address: 0x2000003b1680 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20
00:08:32.583 element at address: 0x2000003b14c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21
00:08:32.583 element at address: 0x2000003b1240 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10
00:08:32.583 element at address: 0x2000003adbc0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22
00:08:32.583 element at address: 0x2000003ada00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23
00:08:32.583 element at address: 0x2000003ad780 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11
00:08:32.583 element at address: 0x2000003aa100 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24
00:08:32.583 element at address: 0x2000003a9f40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25
00:08:32.583 element at address: 0x2000003a9cc0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12
00:08:32.583 element at address: 0x2000003a6640 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26
00:08:32.583 element at address: 0x2000003a6480 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27
00:08:32.583 element at address: 0x2000003a6200 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13
00:08:32.583 element at address: 0x2000003a2b80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28
00:08:32.583 element at address: 0x2000003a29c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29
00:08:32.583 element at address: 0x2000003a2740 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14
00:08:32.583 element at address: 0x20000039f0c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30
00:08:32.583 element at address: 0x20000039ef00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31
00:08:32.583 element at address: 0x20000039ec80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15
00:08:32.583 element at address: 0x20000039b600 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32
00:08:32.583 element at address: 0x20000039b440 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33
00:08:32.583 element at address: 0x20000039b1c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16
00:08:32.583 element at address: 0x200000397b40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34
00:08:32.583 element at address: 0x200000397980 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35
00:08:32.583 element at address: 0x200000397700 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17
00:08:32.583 element at address: 0x200000394080 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36
00:08:32.583 element at address: 0x200000393ec0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37
00:08:32.583 element at address: 0x200000393c40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18
00:08:32.583 element at address: 0x2000003905c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_38
00:08:32.583 element at address: 0x200000390400 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39
00:08:32.583 element at address: 0x200000390180 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19
00:08:32.583 element at address: 0x20000038cb00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40
00:08:32.583 element at address: 0x20000038c940 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41
00:08:32.583 element at address: 0x20000038c6c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20
00:08:32.583 element at address: 0x200000389040 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42
00:08:32.583 element at address: 0x200000388e80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43
00:08:32.583 element at address: 0x200000388c00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21
00:08:32.583 element at address: 0x200000385580 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44
00:08:32.583 element at address: 0x2000003853c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45
00:08:32.583 element at address: 0x200000385140 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22
00:08:32.583 element at address: 0x200000381ac0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46
00:08:32.583 element at address: 0x200000381900 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47
00:08:32.583 element at address: 0x200000381680 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23
00:08:32.583 element at address: 0x20000037e000 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48
00:08:32.583 element at address: 0x20000037de40 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49
00:08:32.583 element at address: 0x20000037dbc0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24
00:08:32.583 element at address: 0x20000037a540 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50
00:08:32.583 element at address: 0x20000037a380 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51
00:08:32.583 element at address: 0x20000037a100 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25
00:08:32.583 element at address: 0x200000376a80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52
00:08:32.583 element at address: 0x2000003768c0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53
00:08:32.583 element at address: 0x200000376640 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26
00:08:32.583 element at address: 0x200000372fc0 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54
00:08:32.583 element at address: 0x200000372e00 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55
00:08:32.583 element at address: 0x200000372b80 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27
00:08:32.583 element at address: 0x20000036f500 with size: 0.000244 MiB
00:08:32.583 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56
00:08:32.583 element at address: 0x20000036f340 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57
00:08:32.584 element at address: 0x20000036f0c0 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28
00:08:32.584 element at address: 0x20000036ba40 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58
00:08:32.584 element at address: 0x20000036b880 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59
00:08:32.584 element at address: 0x20000036b600 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29
00:08:32.584 element at address: 0x200000367f80 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60
00:08:32.584 element at address: 0x200000367dc0 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61
00:08:32.584 element at address: 0x200000367b40 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30
00:08:32.584 element at address: 0x2000003644c0 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62
00:08:32.584 element at address: 0x200000364300 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63
00:08:32.584 element at address: 0x200000364080 with size: 0.000244 MiB
00:08:32.584 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_31
00:08:32.584 element at address: 0x2000003d5d00 with size: 0.000183 MiB
00:08:32.584 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1
00:08:32.584 11:49:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT
00:08:32.584 11:49:46 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 1422154
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@948 -- # '[' -z 1422154 ']'
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@952 -- # kill -0 1422154
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@953 -- # uname
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1422154
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1422154'
00:08:32.584 killing process with pid 1422154
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@967 -- # kill 1422154
00:08:32.584 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@972 -- # wait 1422154
00:08:33.150
00:08:33.150 real 0m1.709s
00:08:33.150 user 0m1.820s
00:08:33.150 sys 0m0.560s
00:08:33.150 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:33.150 11:49:46 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:08:33.150 ************************************
00:08:33.150 END TEST dpdk_mem_utility
00:08:33.150 ************************************
00:08:33.150 11:49:46 -- common/autotest_common.sh@1142 -- # return 0
00:08:33.150 11:49:46 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:08:33.150 11:49:46 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:33.150 11:49:46 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:33.150 11:49:46 -- common/autotest_common.sh@10 -- # set +x
00:08:33.150 ************************************
00:08:33.150 START TEST event
00:08:33.150 ************************************
00:08:33.150 11:49:46 event -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:08:33.150 * Looking for test storage...
00:08:33.150 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:08:33.150 11:49:46 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:08:33.150 11:49:46 event -- bdev/nbd_common.sh@6 -- # set -e
00:08:33.150 11:49:46 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:08:33.150 11:49:46 event -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']'
00:08:33.150 11:49:46 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:33.150 11:49:46 event -- common/autotest_common.sh@10 -- # set +x
00:08:33.150 ************************************
00:08:33.150 START TEST event_perf
00:08:33.150 ************************************
00:08:33.150 11:49:46 event.event_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:08:33.410 Running I/O for 1 seconds...[2024-07-15 11:49:46.761257] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:08:33.410 [2024-07-15 11:49:46.761330] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1422395 ]
00:08:33.410 [2024-07-15 11:49:46.891138] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:33.410 [2024-07-15 11:49:46.996165] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:33.410 [2024-07-15 11:49:46.996196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:33.410 [2024-07-15 11:49:46.996282] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:33.410 Running I/O for 1 seconds...[2024-07-15 11:49:46.996281] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:08:34.871
00:08:34.871 lcore 0: 102918
00:08:34.871 lcore 1: 102920
00:08:34.871 lcore 2: 102922
00:08:34.871 lcore 3: 102923
00:08:34.871 done.
00:08:34.871
00:08:34.871 real 0m1.357s
00:08:34.871 user 0m4.210s
00:08:34.871 sys 0m0.135s
00:08:34.871 11:49:48 event.event_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:34.871 11:49:48 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:08:34.871 ************************************
00:08:34.871 END TEST event_perf
00:08:34.871 ************************************
00:08:34.871 11:49:48 event -- common/autotest_common.sh@1142 -- # return 0
00:08:34.871 11:49:48 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:08:34.871 11:49:48 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:08:34.871 11:49:48 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:34.871 11:49:48 event -- common/autotest_common.sh@10 -- # set +x
00:08:34.871 ************************************
00:08:34.871 START TEST event_reactor
00:08:34.871 ************************************
00:08:34.871 11:49:48 event.event_reactor -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:08:34.871 [2024-07-15 11:49:48.198656] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:08:34.871 [2024-07-15 11:49:48.198725] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1422647 ]
00:08:34.871 [2024-07-15 11:49:48.325204] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:34.871 [2024-07-15 11:49:48.430360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:36.250 test_start
00:08:36.250 oneshot
00:08:36.250 tick 100
00:08:36.250 tick 100
00:08:36.250 tick 250
00:08:36.250 tick 100
00:08:36.250 tick 100
00:08:36.250 tick 250
00:08:36.250 tick 100
00:08:36.250 tick 500
00:08:36.250 tick 100
00:08:36.250 tick 100
00:08:36.250 tick 250
00:08:36.250 tick 100
00:08:36.250 tick 100
00:08:36.250 test_end
00:08:36.250
00:08:36.250 real 0m1.344s
00:08:36.250 user 0m1.205s
00:08:36.250 sys 0m0.133s
00:08:36.250 11:49:49 event.event_reactor -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:36.250 11:49:49 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:08:36.250 ************************************
00:08:36.250 END TEST event_reactor
00:08:36.250 ************************************
00:08:36.250 11:49:49 event -- common/autotest_common.sh@1142 -- # return 0
00:08:36.250 11:49:49 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:08:36.250 11:49:49 event -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:08:36.250 11:49:49 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:36.250 11:49:49 event -- common/autotest_common.sh@10 -- # set +x
00:08:36.250 ************************************
00:08:36.250 START TEST event_reactor_perf
00:08:36.250 ************************************
00:08:36.250 11:49:49 event.event_reactor_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:08:36.250 [2024-07-15 11:49:49.625324] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:08:36.250 [2024-07-15 11:49:49.625399] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1422888 ]
00:08:36.250 [2024-07-15 11:49:49.754125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:36.509 [2024-07-15 11:49:49.854204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:37.443 test_start
00:08:37.443 test_end
00:08:37.443 Performance: 327248 events per second
00:08:37.443
00:08:37.443 real 0m1.346s
00:08:37.443 user 0m1.206s
00:08:37.443 sys 0m0.132s
00:08:37.443 11:49:50 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # xtrace_disable
00:08:37.443 11:49:50 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x
00:08:37.443 ************************************
00:08:37.443 END TEST event_reactor_perf
00:08:37.443 ************************************
00:08:37.443 11:49:50 event -- common/autotest_common.sh@1142 -- # return 0
00:08:37.443 11:49:50 event -- event/event.sh@49 -- # uname -s
00:08:37.443 11:49:50 event -- event/event.sh@49 -- # '[' Linux = Linux ']'
00:08:37.443 11:49:50 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:08:37.443 11:49:50 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:37.443 11:49:50 event -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:37.443 11:49:50 event -- common/autotest_common.sh@10 -- # set +x
00:08:37.443 ************************************
00:08:37.443 START TEST event_scheduler
00:08:37.443 ************************************
00:08:37.443 11:49:51 event.event_scheduler -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh
00:08:37.703 * Looking for test storage...
00:08:37.703 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler
00:08:37.703 11:49:51 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd
00:08:37.703 11:49:51 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=1423175
00:08:37.703 11:49:51 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT
00:08:37.703 11:49:51 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f
00:08:37.703 11:49:51 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 1423175
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@829 -- # '[' -z 1423175 ']'
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@834 -- # local max_retries=100
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:08:37.703 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@838 -- # xtrace_disable
00:08:37.703 11:49:51 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:37.703 [2024-07-15 11:49:51.203853] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:08:37.703 [2024-07-15 11:49:51.203925] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1423175 ]
00:08:37.963 [2024-07-15 11:49:51.397450] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:38.221 [2024-07-15 11:49:51.583930] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:38.221 [2024-07-15 11:49:51.584021] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:38.221 [2024-07-15 11:49:51.584125] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:08:38.221 [2024-07-15 11:49:51.584137] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@862 -- # return 0
00:08:38.789 11:49:52 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 [2024-07-15 11:49:52.159459] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings
00:08:38.789 [2024-07-15 11:49:52.159522] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor
00:08:38.789 [2024-07-15 11:49:52.159557] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20
00:08:38.789 [2024-07-15 11:49:52.159583] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80
00:08:38.789 [2024-07-15 11:49:52.159608] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 11:49:52 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 [2024-07-15 11:49:52.291020] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started.
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:38.789 11:49:52 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@1105 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 ************************************
00:08:38.789 START TEST scheduler_create_thread
00:08:38.789 ************************************
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1123 -- # scheduler_create_thread
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 2
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 3
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:38.789 4
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:38.789 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:39.049 5
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:39.049 6
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:39.049 7
00:08:39.049 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:39.050 8
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x
00:08:39.050 9
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]]
00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30
00:08:39.050 11:49:52
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:39.050 10 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.050 11:49:52 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:39.994 11:49:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:39.994 11:49:53 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:08:39.994 11:49:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:39.994 11:49:53 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:41.374 11:49:54 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:41.374 11:49:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:08:41.374 11:49:54 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:08:41.374 11:49:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@559 -- # xtrace_disable 00:08:41.374 11:49:54 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:42.311 11:49:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:08:42.311 00:08:42.311 real 0m3.386s 00:08:42.311 user 0m0.025s 00:08:42.311 sys 0m0.007s 00:08:42.311 11:49:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:42.311 11:49:55 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:08:42.311 ************************************ 00:08:42.311 END TEST scheduler_create_thread 00:08:42.311 ************************************ 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@1142 -- # return 0 00:08:42.311 11:49:55 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:08:42.311 11:49:55 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 1423175 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@948 -- # '[' -z 1423175 ']' 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@952 -- # kill -0 1423175 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@953 -- # uname 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1423175 00:08:42.311 11:49:55 event.event_scheduler -- 
common/autotest_common.sh@954 -- # process_name=reactor_2 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@958 -- # '[' reactor_2 = sudo ']' 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1423175' 00:08:42.311 killing process with pid 1423175 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@967 -- # kill 1423175 00:08:42.311 11:49:55 event.event_scheduler -- common/autotest_common.sh@972 -- # wait 1423175 00:08:42.570 [2024-07-15 11:49:56.102210] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:08:43.139 00:08:43.139 real 0m5.440s 00:08:43.139 user 0m10.497s 00:08:43.139 sys 0m0.627s 00:08:43.139 11:49:56 event.event_scheduler -- common/autotest_common.sh@1124 -- # xtrace_disable 00:08:43.139 11:49:56 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:08:43.139 ************************************ 00:08:43.139 END TEST event_scheduler 00:08:43.139 ************************************ 00:08:43.139 11:49:56 event -- common/autotest_common.sh@1142 -- # return 0 00:08:43.139 11:49:56 event -- event/event.sh@51 -- # modprobe -n nbd 00:08:43.139 11:49:56 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:08:43.139 11:49:56 event -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:08:43.139 11:49:56 event -- common/autotest_common.sh@1105 -- # xtrace_disable 00:08:43.139 11:49:56 event -- common/autotest_common.sh@10 -- # set +x 00:08:43.139 ************************************ 00:08:43.139 START TEST app_repeat 00:08:43.139 ************************************ 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@1123 -- # app_repeat_test 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:43.139 11:49:56 
event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@19 -- # repeat_pid=1423924 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 1423924' 00:08:43.139 Process app_repeat pid: 1423924 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:08:43.139 spdk_app_start Round 0 00:08:43.139 11:49:56 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1423924 /var/tmp/spdk-nbd.sock 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1423924 ']' 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:43.139 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:43.139 11:49:56 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:43.139 [2024-07-15 11:49:56.622275] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:08:43.139 [2024-07-15 11:49:56.622406] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1423924 ] 00:08:43.399 [2024-07-15 11:49:56.824667] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:43.399 [2024-07-15 11:49:56.935710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:43.399 [2024-07-15 11:49:56.935713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.657 11:49:57 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:43.657 11:49:57 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:43.657 11:49:57 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:43.916 Malloc0 00:08:43.916 11:49:57 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:44.175 Malloc1 00:08:44.175 11:49:57 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@92 
-- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:44.175 11:49:57 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:44.435 /dev/nbd0 00:08:44.435 11:49:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:44.435 11:49:57 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:44.435 11:49:57 event.app_repeat -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:44.435 1+0 records in 00:08:44.435 1+0 records out 00:08:44.435 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267626 s, 15.3 MB/s 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.435 11:49:57 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:44.435 11:49:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.435 11:49:57 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:44.435 11:49:57 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:44.694 /dev/nbd1 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:44.694 11:49:58 event.app_repeat -- 
common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:44.694 1+0 records in 00:08:44.694 1+0 records out 00:08:44.694 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245403 s, 16.7 MB/s 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:44.694 11:49:58 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.694 11:49:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:44.954 { 00:08:44.954 "nbd_device": "/dev/nbd0", 00:08:44.954 "bdev_name": "Malloc0" 00:08:44.954 }, 00:08:44.954 { 00:08:44.954 
"nbd_device": "/dev/nbd1", 00:08:44.954 "bdev_name": "Malloc1" 00:08:44.954 } 00:08:44.954 ]' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:44.954 { 00:08:44.954 "nbd_device": "/dev/nbd0", 00:08:44.954 "bdev_name": "Malloc0" 00:08:44.954 }, 00:08:44.954 { 00:08:44.954 "nbd_device": "/dev/nbd1", 00:08:44.954 "bdev_name": "Malloc1" 00:08:44.954 } 00:08:44.954 ]' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:44.954 /dev/nbd1' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:44.954 /dev/nbd1' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:44.954 256+0 records in 00:08:44.954 256+0 
records out 00:08:44.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0105403 s, 99.5 MB/s 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:44.954 256+0 records in 00:08:44.954 256+0 records out 00:08:44.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0301812 s, 34.7 MB/s 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:44.954 256+0 records in 00:08:44.954 256+0 records out 00:08:44.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0311321 s, 33.7 MB/s 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@82 -- # 
for i in "${nbd_list[@]}" 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:44.954 11:49:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:45.213 11:49:58 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:45.213 11:49:58 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:45.213 11:49:58 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.213 11:49:58 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:45.472 11:49:58 event.app_repeat -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:45.731 11:49:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@104 -- # 
count=0 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:45.990 11:49:59 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:45.990 11:49:59 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:46.250 11:49:59 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:46.509 [2024-07-15 11:49:59.929870] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:46.509 [2024-07-15 11:50:00.026862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:46.509 [2024-07-15 11:50:00.026867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:46.509 [2024-07-15 11:50:00.073063] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:46.509 [2024-07-15 11:50:00.073115] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:49.814 11:50:02 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:49.814 11:50:02 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:08:49.814 spdk_app_start Round 1 00:08:49.814 11:50:02 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1423924 /var/tmp/spdk-nbd.sock 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1423924 ']' 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:49.814 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:49.814 11:50:02 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:49.814 11:50:02 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:49.814 Malloc0 00:08:49.814 11:50:03 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:50.073 Malloc1 00:08:50.073 11:50:03 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.073 11:50:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:50.331 /dev/nbd0 00:08:50.331 11:50:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:50.331 11:50:03 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.331 11:50:03 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:50.332 1+0 records in 00:08:50.332 1+0 records out 00:08:50.332 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024086 s, 17.0 MB/s 00:08:50.332 11:50:03 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.332 11:50:03 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:50.332 11:50:03 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.332 11:50:03 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.332 11:50:03 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:50.332 11:50:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:50.332 11:50:03 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.332 11:50:03 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:50.589 /dev/nbd1 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:50.589 1+0 records in 00:08:50.589 1+0 records out 00:08:50.589 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251406 s, 16.3 MB/s 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:50.589 11:50:04 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:50.589 11:50:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:50.847 { 00:08:50.847 "nbd_device": "/dev/nbd0", 00:08:50.847 "bdev_name": "Malloc0" 00:08:50.847 }, 00:08:50.847 { 00:08:50.847 "nbd_device": "/dev/nbd1", 00:08:50.847 "bdev_name": "Malloc1" 00:08:50.847 } 00:08:50.847 ]' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:50.847 { 00:08:50.847 "nbd_device": "/dev/nbd0", 00:08:50.847 "bdev_name": "Malloc0" 00:08:50.847 }, 00:08:50.847 { 00:08:50.847 "nbd_device": "/dev/nbd1", 00:08:50.847 "bdev_name": "Malloc1" 00:08:50.847 } 00:08:50.847 ]' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:50.847 /dev/nbd1' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:50.847 /dev/nbd1' 00:08:50.847 
11:50:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:50.847 256+0 records in 00:08:50.847 256+0 records out 00:08:50.847 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103079 s, 102 MB/s 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:50.847 11:50:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:51.104 256+0 records in 00:08:51.104 256+0 records out 00:08:51.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0297673 s, 35.2 MB/s 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:51.104 256+0 records in 00:08:51.104 256+0 records out 00:08:51.104 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0312435 s, 33.6 MB/s 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.104 11:50:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:51.361 11:50:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:51.362 11:50:04 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:51.619 11:50:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:51.879 11:50:05 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:51.879 11:50:05 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:52.137 11:50:05 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:52.396 [2024-07-15 11:50:05.826145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:52.396 [2024-07-15 11:50:05.924584] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:52.396 [2024-07-15 11:50:05.924588] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:08:52.396 [2024-07-15 11:50:05.978117] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:52.396 [2024-07-15 11:50:05.978168] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:55.683 11:50:08 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:08:55.683 11:50:08 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:08:55.683 spdk_app_start Round 2 00:08:55.683 11:50:08 event.app_repeat -- event/event.sh@25 -- # waitforlisten 1423924 /var/tmp/spdk-nbd.sock 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1423924 ']' 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:55.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:08:55.683 11:50:08 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:08:55.683 11:50:08 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:55.683 Malloc0 00:08:55.683 11:50:09 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:08:55.942 Malloc1 00:08:55.942 11:50:09 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:55.942 11:50:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:08:56.201 /dev/nbd0 00:08:56.201 11:50:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:08:56.201 11:50:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:56.201 1+0 records in 00:08:56.201 1+0 records out 00:08:56.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250578 s, 16.3 MB/s 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:56.201 11:50:09 event.app_repeat -- 
common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.201 11:50:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:56.201 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.201 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:56.201 11:50:09 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:08:56.461 /dev/nbd1 00:08:56.461 11:50:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:08:56.461 11:50:09 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@867 -- # local i 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@871 -- # break 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:08:56.462 1+0 records in 00:08:56.462 1+0 records out 00:08:56.462 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00024893 s, 16.5 MB/s 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@884 -- # size=4096 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:08:56.462 11:50:09 event.app_repeat -- common/autotest_common.sh@887 -- # return 0 00:08:56.462 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:08:56.462 11:50:09 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:08:56.462 11:50:09 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:56.462 11:50:09 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.462 11:50:09 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:08:56.721 { 00:08:56.721 "nbd_device": "/dev/nbd0", 00:08:56.721 "bdev_name": "Malloc0" 00:08:56.721 }, 00:08:56.721 { 00:08:56.721 "nbd_device": "/dev/nbd1", 00:08:56.721 "bdev_name": "Malloc1" 00:08:56.721 } 00:08:56.721 ]' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:08:56.721 { 00:08:56.721 "nbd_device": "/dev/nbd0", 00:08:56.721 "bdev_name": "Malloc0" 00:08:56.721 }, 00:08:56.721 { 00:08:56.721 "nbd_device": "/dev/nbd1", 00:08:56.721 "bdev_name": "Malloc1" 00:08:56.721 } 00:08:56.721 ]' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:08:56.721 /dev/nbd1' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:08:56.721 /dev/nbd1' 00:08:56.721 
11:50:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:08:56.721 256+0 records in 00:08:56.721 256+0 records out 00:08:56.721 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00680524 s, 154 MB/s 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:08:56.721 256+0 records in 00:08:56.721 256+0 records out 00:08:56.721 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0200268 s, 52.4 MB/s 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:08:56.721 11:50:10 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:08:56.980 256+0 records in 00:08:56.980 256+0 records out 00:08:56.980 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0315406 s, 33.2 MB/s 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:56.980 11:50:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:08:57.239 11:50:10 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:57.497 11:50:10 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:57.755 11:50:11 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:57.755 11:50:11 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:58.013 11:50:11 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:58.272 [2024-07-15 11:50:11.742117] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:58.272 [2024-07-15 11:50:11.840170] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:58.272 [2024-07-15 11:50:11.840175] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:08:58.530 [2024-07-15 11:50:11.892315] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:58.530 [2024-07-15 11:50:11.892365] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:09:01.061 11:50:14 event.app_repeat -- event/event.sh@38 -- # waitforlisten 1423924 /var/tmp/spdk-nbd.sock 00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@829 -- # '[' -z 1423924 ']' 00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:01.061 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:01.061 11:50:14 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@862 -- # return 0 00:09:01.320 11:50:14 event.app_repeat -- event/event.sh@39 -- # killprocess 1423924 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@948 -- # '[' -z 1423924 ']' 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@952 -- # kill -0 1423924 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@953 -- # uname 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1423924 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1423924' 00:09:01.320 killing process with pid 1423924 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@967 -- # kill 1423924 00:09:01.320 11:50:14 event.app_repeat -- common/autotest_common.sh@972 -- # wait 1423924 00:09:01.579 spdk_app_start is called in Round 0. 00:09:01.579 Shutdown signal received, stop current app iteration 00:09:01.579 Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 reinitialization... 00:09:01.579 spdk_app_start is called in Round 1. 00:09:01.579 Shutdown signal received, stop current app iteration 00:09:01.579 Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 reinitialization... 00:09:01.579 spdk_app_start is called in Round 2. 
00:09:01.579 Shutdown signal received, stop current app iteration 00:09:01.579 Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 reinitialization... 00:09:01.579 spdk_app_start is called in Round 3. 00:09:01.579 Shutdown signal received, stop current app iteration 00:09:01.579 11:50:15 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:09:01.579 11:50:15 event.app_repeat -- event/event.sh@42 -- # return 0 00:09:01.579 00:09:01.579 real 0m18.482s 00:09:01.579 user 0m40.105s 00:09:01.579 sys 0m3.942s 00:09:01.579 11:50:15 event.app_repeat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:01.579 11:50:15 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:09:01.579 ************************************ 00:09:01.579 END TEST app_repeat 00:09:01.579 ************************************ 00:09:01.579 11:50:15 event -- common/autotest_common.sh@1142 -- # return 0 00:09:01.579 11:50:15 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:09:01.579 00:09:01.579 real 0m28.512s 00:09:01.579 user 0m57.436s 00:09:01.579 sys 0m5.344s 00:09:01.579 11:50:15 event -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:01.579 11:50:15 event -- common/autotest_common.sh@10 -- # set +x 00:09:01.579 ************************************ 00:09:01.579 END TEST event 00:09:01.579 ************************************ 00:09:01.579 11:50:15 -- common/autotest_common.sh@1142 -- # return 0 00:09:01.579 11:50:15 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:01.579 11:50:15 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:01.579 11:50:15 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.579 11:50:15 -- common/autotest_common.sh@10 -- # set +x 00:09:01.579 ************************************ 00:09:01.579 START TEST thread 00:09:01.579 ************************************ 00:09:01.579 11:50:15 thread -- common/autotest_common.sh@1123 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:09:01.837 * Looking for test storage... 00:09:01.837 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:09:01.837 11:50:15 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:01.837 11:50:15 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:01.837 11:50:15 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:01.837 11:50:15 thread -- common/autotest_common.sh@10 -- # set +x 00:09:01.837 ************************************ 00:09:01.837 START TEST thread_poller_perf 00:09:01.837 ************************************ 00:09:01.837 11:50:15 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:09:01.837 [2024-07-15 11:50:15.349009] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:01.837 [2024-07-15 11:50:15.349077] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1426521 ] 00:09:02.096 [2024-07-15 11:50:15.476203] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:02.096 [2024-07-15 11:50:15.577272] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.096 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:09:03.099 ====================================== 00:09:03.099 busy:2315147846 (cyc) 00:09:03.099 total_run_count: 267000 00:09:03.099 tsc_hz: 2300000000 (cyc) 00:09:03.099 ====================================== 00:09:03.099 poller_cost: 8670 (cyc), 3769 (nsec) 00:09:03.099 00:09:03.099 real 0m1.363s 00:09:03.099 user 0m1.217s 00:09:03.099 sys 0m0.139s 00:09:03.099 11:50:16 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:03.099 11:50:16 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:03.099 ************************************ 00:09:03.099 END TEST thread_poller_perf 00:09:03.099 ************************************ 00:09:03.358 11:50:16 thread -- common/autotest_common.sh@1142 -- # return 0 00:09:03.358 11:50:16 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:03.358 11:50:16 thread -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:03.358 11:50:16 thread -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:03.358 11:50:16 thread -- common/autotest_common.sh@10 -- # set +x 00:09:03.358 ************************************ 00:09:03.358 START TEST thread_poller_perf 00:09:03.358 ************************************ 00:09:03.358 11:50:16 thread.thread_poller_perf -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:09:03.358 [2024-07-15 11:50:16.793836] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:03.358 [2024-07-15 11:50:16.793898] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1426756 ] 00:09:03.358 [2024-07-15 11:50:16.923043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.616 [2024-07-15 11:50:17.023249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.616 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:09:04.551 ====================================== 00:09:04.551 busy:2302819412 (cyc) 00:09:04.551 total_run_count: 3494000 00:09:04.551 tsc_hz: 2300000000 (cyc) 00:09:04.551 ====================================== 00:09:04.551 poller_cost: 659 (cyc), 286 (nsec) 00:09:04.551 00:09:04.551 real 0m1.350s 00:09:04.551 user 0m1.209s 00:09:04.551 sys 0m0.135s 00:09:04.551 11:50:18 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.551 11:50:18 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:09:04.551 ************************************ 00:09:04.551 END TEST thread_poller_perf 00:09:04.551 ************************************ 00:09:04.809 11:50:18 thread -- common/autotest_common.sh@1142 -- # return 0 00:09:04.809 11:50:18 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:09:04.809 00:09:04.809 real 0m2.990s 00:09:04.809 user 0m2.515s 00:09:04.809 sys 0m0.483s 00:09:04.809 11:50:18 thread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:04.809 11:50:18 thread -- common/autotest_common.sh@10 -- # set +x 00:09:04.809 ************************************ 00:09:04.809 END TEST thread 00:09:04.809 ************************************ 00:09:04.809 11:50:18 -- common/autotest_common.sh@1142 -- # return 0 00:09:04.809 11:50:18 -- spdk/autotest.sh@183 -- # run_test accel 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:04.809 11:50:18 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:04.809 11:50:18 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:04.809 11:50:18 -- common/autotest_common.sh@10 -- # set +x 00:09:04.809 ************************************ 00:09:04.809 START TEST accel 00:09:04.809 ************************************ 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:09:04.809 * Looking for test storage... 00:09:04.809 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:04.809 11:50:18 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:09:04.809 11:50:18 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:09:04.809 11:50:18 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:04.809 11:50:18 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1427051 00:09:04.809 11:50:18 accel -- accel/accel.sh@63 -- # waitforlisten 1427051 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@829 -- # '[' -z 1427051 ']' 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:04.809 11:50:18 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:04.809 11:50:18 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:04.809 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:04.809 11:50:18 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:04.809 11:50:18 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:04.809 11:50:18 accel -- common/autotest_common.sh@10 -- # set +x 00:09:04.809 11:50:18 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:04.809 11:50:18 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:04.809 11:50:18 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:04.809 11:50:18 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:04.809 11:50:18 accel -- accel/accel.sh@41 -- # jq -r . 00:09:05.068 [2024-07-15 11:50:18.430811] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:05.068 [2024-07-15 11:50:18.430887] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427051 ] 00:09:05.068 [2024-07-15 11:50:18.559946] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.068 [2024-07-15 11:50:18.658354] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@862 -- # return 0 00:09:06.013 11:50:19 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:06.013 11:50:19 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:06.013 11:50:19 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:06.013 11:50:19 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:09:06.013 11:50:19 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:06.013 11:50:19 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:06.013 11:50:19 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@10 -- # set +x 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # 
IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # IFS== 00:09:06.013 11:50:19 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:06.013 11:50:19 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:06.013 11:50:19 accel -- accel/accel.sh@75 -- # killprocess 1427051 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@948 -- # '[' -z 1427051 ']' 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@952 -- # kill -0 1427051 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@953 -- # uname 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1427051 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1427051' 00:09:06.013 killing process with pid 1427051 00:09:06.013 11:50:19 accel -- common/autotest_common.sh@967 -- # kill 1427051 00:09:06.013 
11:50:19 accel -- common/autotest_common.sh@972 -- # wait 1427051 00:09:06.581 11:50:19 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:06.581 11:50:19 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@10 -- # set +x 00:09:06.581 11:50:19 accel.accel_help -- common/autotest_common.sh@1123 -- # accel_perf -h 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:09:06.581 11:50:19 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:09:06.581 11:50:19 accel.accel_help -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:06.581 11:50:19 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:06.581 11:50:19 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:06.581 11:50:19 accel -- common/autotest_common.sh@10 -- # set +x 00:09:06.581 ************************************ 00:09:06.581 START TEST accel_missing_filename 00:09:06.581 ************************************ 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@648 -- # local es=0 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:06.581 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:09:06.581 11:50:20 
accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:09:06.581 11:50:20 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:09:06.581 [2024-07-15 11:50:20.056038] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:06.581 [2024-07-15 11:50:20.056107] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427272 ] 00:09:06.841 [2024-07-15 11:50:20.185901] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:06.841 [2024-07-15 11:50:20.285284] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.841 [2024-07-15 11:50:20.350168] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:06.841 [2024-07-15 11:50:20.422375] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:07.101 A filename is required. 
00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # es=234 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # es=106 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # case "$es" in 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@668 -- # es=1 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:07.101 00:09:07.101 real 0m0.499s 00:09:07.101 user 0m0.338s 00:09:07.101 sys 0m0.191s 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.101 11:50:20 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:09:07.101 ************************************ 00:09:07.101 END TEST accel_missing_filename 00:09:07.101 ************************************ 00:09:07.101 11:50:20 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:07.101 11:50:20 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.101 11:50:20 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']' 00:09:07.101 11:50:20 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.101 11:50:20 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.101 ************************************ 00:09:07.101 START TEST accel_compress_verify 00:09:07.101 ************************************ 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@648 -- # local es=0 00:09:07.101 11:50:20 
accel.accel_compress_verify -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:07.101 11:50:20 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:07.101 11:50:20 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:09:07.101 [2024-07-15 11:50:20.641548] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:07.101 [2024-07-15 11:50:20.641617] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427303 ] 00:09:07.361 [2024-07-15 11:50:20.773679] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:07.361 [2024-07-15 11:50:20.879079] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:07.361 [2024-07-15 11:50:20.947654] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:07.621 [2024-07-15 11:50:21.021634] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:09:07.621 00:09:07.621 Compression does not support the verify option, aborting. 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # es=161 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # es=33 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # case "$es" in 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@668 -- # es=1 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:07.621 00:09:07.621 real 0m0.514s 00:09:07.621 user 0m0.338s 00:09:07.621 sys 0m0.204s 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.621 11:50:21 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:09:07.621 ************************************ 00:09:07.621 END TEST accel_compress_verify 00:09:07.621 ************************************ 00:09:07.621 11:50:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:07.621 11:50:21 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT 
accel_perf -t 1 -w foobar 00:09:07.621 11:50:21 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:07.621 11:50:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.621 11:50:21 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.621 ************************************ 00:09:07.621 START TEST accel_wrong_workload 00:09:07.621 ************************************ 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w foobar 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@648 -- # local es=0 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@636 -- # local arg=accel_perf 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:07.621 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # type -t accel_perf 00:09:07.622 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:09:07.622 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w foobar 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.622 11:50:21 accel.accel_wrong_workload -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:09:07.622 11:50:21 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:09:07.881 Unsupported workload type: foobar 00:09:07.881 [2024-07-15 11:50:21.237981] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:09:07.881 accel_perf options: 00:09:07.881 [-h help message] 00:09:07.881 [-q queue depth per core] 00:09:07.881 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:09:07.881 [-T number of threads per core 00:09:07.881 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:09:07.881 [-t time in seconds] 00:09:07.881 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:09:07.881 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:09:07.881 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:09:07.881 [-l for compress/decompress workloads, name of uncompressed input file 00:09:07.881 [-S for crc32c workload, use this seed value (default 0) 00:09:07.881 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:09:07.881 [-f for fill workload, use this BYTE value (default 255) 00:09:07.881 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:09:07.881 [-y verify result if this switch is on] 00:09:07.881 [-a tasks to allocate per core (default: same value as -q)] 00:09:07.881 Can be used to spread operations across a wider range of memory. 
00:09:07.881 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # es=1
00:09:07.881 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:09:07.881 Error: writing output failed: Broken pipe
00:09:07.882 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:09:07.882 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:09:07.882
00:09:07.882 real 0m0.043s
00:09:07.882 user 0m0.055s
00:09:07.882 sys 0m0.026s
00:09:07.882 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:07.882 11:50:21 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x
00:09:07.882 ************************************
00:09:07.882 END TEST accel_wrong_workload
00:09:07.882 ************************************
00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:07.882 11:50:21 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1
00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1099 -- # '[' 10 -le 1 ']'
00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:07.882 11:50:21 accel -- common/autotest_common.sh@10 -- # set +x
00:09:07.882 ************************************
00:09:07.882 START TEST accel_negative_buffers
00:09:07.882 ************************************
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@1123 -- # NOT accel_perf -t 1 -w xor -y -x -1
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@648 -- # local es=0
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@636 -- # local arg=accel_perf
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # type -t accel_perf
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # accel_perf -t 1 -w xor -y -x -1
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=,
00:09:07.882 11:50:21 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r .
00:09:07.882 -x option must be non-negative.
00:09:07.882 [2024-07-15 11:50:21.366737] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1
00:09:07.882 accel_perf options:
00:09:07.882 [-h help message]
00:09:07.882 [-q queue depth per core]
00:09:07.882 [-C for supported workloads, use this value to configure the io vector size to test (default 1)
00:09:07.882 [-T number of threads per core
00:09:07.882 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)]
00:09:07.882 [-t time in seconds]
00:09:07.882 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor,
00:09:07.882 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy
00:09:07.882 [-M assign module to the operation, not compatible with accel_assign_opc RPC
00:09:07.882 [-l for compress/decompress workloads, name of uncompressed input file
00:09:07.882 [-S for crc32c workload, use this seed value (default 0)
00:09:07.882 [-P for compare workload, percentage of operations that should miscompare (percent, default 0)
00:09:07.882 [-f for fill workload, use this BYTE value (default 255)
00:09:07.882 [-x for xor workload, use this number of source buffers (default, minimum: 2)]
00:09:07.882 [-y verify result if this switch is on]
00:09:07.882 [-a tasks to allocate per core (default: same value as -q)]
00:09:07.882 Can be used to spread operations across a wider range of memory.
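The usage message above shows why the run failed: accel_perf rejects a negative source-buffer count before starting any workload. The actual check lives in accel_perf's C option parser; the following shell fragment is only an illustrative sketch of that rejection, mirroring the message printed in the log:

```shell
# Illustrative sketch only: mimics accel_perf's rejection of '-x -1'.
# The real validation is done in C inside spdk_app_parse_args / accel_perf,
# not in shell.
x=-1
if [ "$x" -lt 0 ]; then
    echo "-x option must be non-negative."
fi
```

A passing invocation would instead use `-x 2` or higher, since the help text states the xor workload needs a minimum of two source buffers.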
00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # es=1 00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:09:07.882 00:09:07.882 real 0m0.042s 00:09:07.882 user 0m0.020s 00:09:07.882 sys 0m0.022s 00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:07.882 11:50:21 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:09:07.882 ************************************ 00:09:07.882 END TEST accel_negative_buffers 00:09:07.882 ************************************ 00:09:07.882 Error: writing output failed: Broken pipe 00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:07.882 11:50:21 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:07.882 11:50:21 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:07.882 11:50:21 accel -- common/autotest_common.sh@10 -- # set +x 00:09:07.882 ************************************ 00:09:07.882 START TEST accel_crc32c 00:09:07.882 ************************************ 00:09:07.882 11:50:21 accel.accel_crc32c -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -S 32 -y 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 
00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:07.882 11:50:21 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:08.141 [2024-07-15 11:50:21.486493] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:08.141 [2024-07-15 11:50:21.486559] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427525 ] 00:09:08.141 [2024-07-15 11:50:21.616751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:08.141 [2024-07-15 11:50:21.716867] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 
00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 
accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:08.401 11:50:21 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 
accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:09.782 11:50:22 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:09.782 00:09:09.782 real 0m1.496s 00:09:09.782 user 0m1.310s 00:09:09.782 sys 0m0.188s 00:09:09.782 11:50:22 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:09.782 11:50:22 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:09.782 ************************************ 00:09:09.782 END TEST accel_crc32c 00:09:09.782 ************************************ 00:09:09.782 11:50:22 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:09.782 11:50:22 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:09:09.782 11:50:22 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:09.782 11:50:22 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:09.782 11:50:22 accel -- common/autotest_common.sh@10 -- # set +x 00:09:09.782 ************************************ 
00:09:09.782 START TEST accel_crc32c_C2 00:09:09.782 ************************************ 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w crc32c -y -C 2 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:09.782 [2024-07-15 11:50:23.065076] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:09.782 [2024-07-15 11:50:23.065139] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427727 ] 00:09:09.782 [2024-07-15 11:50:23.193678] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:09.782 [2024-07-15 11:50:23.297104] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:09.782 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.042 
11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.042 11:50:23 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:10.980 00:09:10.980 real 0m1.505s 00:09:10.980 user 0m1.302s 00:09:10.980 sys 0m0.210s 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:10.980 11:50:24 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:10.980 ************************************ 00:09:10.980 END TEST accel_crc32c_C2 00:09:10.980 ************************************ 00:09:11.240 11:50:24 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:11.240 11:50:24 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:09:11.240 11:50:24 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:11.240 11:50:24 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:11.240 11:50:24 accel -- common/autotest_common.sh@10 -- # set +x 00:09:11.240 ************************************ 00:09:11.240 START TEST accel_copy 00:09:11.240 ************************************ 00:09:11.240 11:50:24 accel.accel_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy -y 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy -y 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:11.240 11:50:24 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:09:11.240 [2024-07-15 11:50:24.653281] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:11.240 [2024-07-15 11:50:24.653343] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1427920 ] 00:09:11.240 [2024-07-15 11:50:24.783347] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.500 [2024-07-15 11:50:24.890878] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 
accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:09:11.501 11:50:24 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:11.501 11:50:24 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- 
accel/accel.sh@20 -- # val= 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:09:12.880 11:50:26 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:12.880 00:09:12.880 real 0m1.522s 00:09:12.880 user 0m1.323s 00:09:12.880 sys 0m0.199s 00:09:12.880 11:50:26 accel.accel_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:12.880 11:50:26 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:09:12.880 ************************************ 00:09:12.880 END TEST accel_copy 00:09:12.880 ************************************ 00:09:12.880 11:50:26 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:12.880 11:50:26 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:12.880 11:50:26 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:12.880 11:50:26 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:12.880 11:50:26 accel -- common/autotest_common.sh@10 -- # set +x 00:09:12.880 ************************************ 00:09:12.880 START TEST accel_fill 00:09:12.880 ************************************ 00:09:12.880 11:50:26 accel.accel_fill -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@15 -- # 
accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:09:12.880 11:50:26 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:09:12.880 [2024-07-15 11:50:26.270020] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:12.880 [2024-07-15 11:50:26.270146] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1428162 ] 00:09:12.880 [2024-07-15 11:50:26.466717] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:13.140 [2024-07-15 11:50:26.571946] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 
accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:13.140 11:50:26 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:09:14.521 11:50:27 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:14.521 00:09:14.521 real 0m1.598s 00:09:14.521 user 0m1.334s 00:09:14.521 sys 0m0.265s 00:09:14.521 11:50:27 accel.accel_fill -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:14.521 11:50:27 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:09:14.521 ************************************ 00:09:14.521 END TEST accel_fill 00:09:14.521 ************************************ 00:09:14.521 11:50:27 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:14.521 11:50:27 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:09:14.521 11:50:27 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:14.521 11:50:27 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:14.521 11:50:27 accel -- common/autotest_common.sh@10 -- # set +x 00:09:14.521 ************************************ 00:09:14.521 START TEST accel_copy_crc32c 00:09:14.521 ************************************ 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- 
common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:09:14.521 11:50:27 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:09:14.521 [2024-07-15 11:50:27.943281] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:14.521 [2024-07-15 11:50:27.943344] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1428477 ] 00:09:14.521 [2024-07-15 11:50:28.061919] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:14.780 [2024-07-15 11:50:28.164316] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.780 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:14.781 11:50:28 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:16.161 00:09:16.161 real 0m1.495s 00:09:16.161 user 0m1.322s 00:09:16.161 sys 0m0.178s 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:16.161 11:50:29 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:09:16.161 ************************************ 00:09:16.161 END TEST accel_copy_crc32c 00:09:16.161 ************************************ 00:09:16.161 11:50:29 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:16.161 11:50:29 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:09:16.161 11:50:29 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:16.161 11:50:29 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:16.161 11:50:29 accel -- common/autotest_common.sh@10 -- # set +x 00:09:16.161 ************************************ 00:09:16.161 START TEST accel_copy_crc32c_C2 00:09:16.161 
************************************ 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:09:16.161 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:09:16.161 [2024-07-15 11:50:29.517249] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:16.161 [2024-07-15 11:50:29.517310] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1428675 ] 00:09:16.161 [2024-07-15 11:50:29.645614] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:16.161 [2024-07-15 11:50:29.744863] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:16.421 11:50:29 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 
11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:17.800 00:09:17.800 real 0m1.491s 00:09:17.800 user 0m1.315s 00:09:17.800 sys 0m0.178s 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:17.800 11:50:30 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:09:17.800 ************************************ 00:09:17.800 END TEST accel_copy_crc32c_C2 00:09:17.800 ************************************ 00:09:17.800 11:50:31 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:17.800 11:50:31 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:09:17.800 11:50:31 accel -- common/autotest_common.sh@1099 -- # 
'[' 7 -le 1 ']' 00:09:17.800 11:50:31 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:17.800 11:50:31 accel -- common/autotest_common.sh@10 -- # set +x 00:09:17.800 ************************************ 00:09:17.800 START TEST accel_dualcast 00:09:17.800 ************************************ 00:09:17.800 11:50:31 accel.accel_dualcast -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dualcast -y 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:09:17.800 11:50:31 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:09:17.800 [2024-07-15 11:50:31.090006] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:17.800 [2024-07-15 11:50:31.090068] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1428874 ] 00:09:17.800 [2024-07-15 11:50:31.218119] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:17.800 [2024-07-15 11:50:31.324613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.060 11:50:31 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:09:18.998 11:50:32 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:18.998 00:09:18.998 real 0m1.508s 00:09:18.998 user 0m1.324s 00:09:18.998 sys 0m0.185s 00:09:18.998 11:50:32 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:18.998 11:50:32 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:09:18.998 ************************************ 00:09:18.998 END TEST accel_dualcast 00:09:18.998 ************************************ 00:09:19.258 11:50:32 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:19.258 11:50:32 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:09:19.258 11:50:32 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:19.258 11:50:32 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:19.258 11:50:32 accel -- common/autotest_common.sh@10 -- # set +x 00:09:19.258 ************************************ 00:09:19.258 START TEST accel_compare 00:09:19.258 ************************************ 00:09:19.258 11:50:32 accel.accel_compare -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compare -y 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:19.258 
11:50:32 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:09:19.258 11:50:32 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:09:19.258 [2024-07-15 11:50:32.685229] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:19.258 [2024-07-15 11:50:32.685302] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1429069 ] 00:09:19.258 [2024-07-15 11:50:32.817175] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:19.517 [2024-07-15 11:50:32.923077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:19.517 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.517 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:32 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 
11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:19.518 11:50:33 
accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:19.518 11:50:33 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@21 
-- # case "$var" in 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:09:20.898 11:50:34 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:20.898 00:09:20.898 real 0m1.518s 00:09:20.898 user 0m1.318s 00:09:20.898 sys 0m0.208s 00:09:20.898 11:50:34 accel.accel_compare -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:20.898 11:50:34 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:09:20.898 ************************************ 00:09:20.898 END TEST accel_compare 00:09:20.898 ************************************ 00:09:20.898 11:50:34 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:20.899 11:50:34 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:09:20.899 11:50:34 accel -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:09:20.899 11:50:34 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:20.899 11:50:34 accel -- common/autotest_common.sh@10 -- # set +x 00:09:20.899 ************************************ 00:09:20.899 START TEST accel_xor 00:09:20.899 ************************************ 00:09:20.899 11:50:34 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:20.899 11:50:34 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:20.899 [2024-07-15 11:50:34.282419] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:20.899 [2024-07-15 11:50:34.282480] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1429268 ] 00:09:20.899 [2024-07-15 11:50:34.411502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:21.159 [2024-07-15 11:50:34.515855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- 
accel/accel.sh@20 -- # val=0x1 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:21.159 11:50:34 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 
00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:22.537 00:09:22.537 real 0m1.510s 00:09:22.537 user 0m1.337s 00:09:22.537 sys 0m0.179s 00:09:22.537 11:50:35 accel.accel_xor -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:22.537 11:50:35 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:22.537 ************************************ 00:09:22.537 END TEST accel_xor 00:09:22.537 ************************************ 00:09:22.537 11:50:35 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:22.537 11:50:35 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:09:22.537 11:50:35 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:22.537 11:50:35 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:22.537 11:50:35 accel -- common/autotest_common.sh@10 -- # set +x 00:09:22.537 ************************************ 00:09:22.537 START TEST accel_xor 00:09:22.537 ************************************ 00:09:22.537 11:50:35 accel.accel_xor -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w xor -y -x 3 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 
00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:09:22.537 11:50:35 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:09:22.537 [2024-07-15 11:50:35.883628] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:22.537 [2024-07-15 11:50:35.883700] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1429523 ] 00:09:22.537 [2024-07-15 11:50:36.014266] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:22.537 [2024-07-15 11:50:36.115023] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 
11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:22.797 11:50:36 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 
-- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:09:24.178 11:50:37 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:24.178 00:09:24.178 real 0m1.510s 00:09:24.178 user 0m1.310s 00:09:24.178 sys 0m0.205s 00:09:24.178 11:50:37 accel.accel_xor -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:09:24.178 11:50:37 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:09:24.178 ************************************ 00:09:24.178 END TEST accel_xor 00:09:24.178 ************************************ 00:09:24.178 11:50:37 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:24.178 11:50:37 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:09:24.178 11:50:37 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:24.178 11:50:37 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:24.178 11:50:37 accel -- common/autotest_common.sh@10 -- # set +x 00:09:24.178 ************************************ 00:09:24.178 START TEST accel_dif_verify 00:09:24.178 ************************************ 00:09:24.178 11:50:37 accel.accel_dif_verify -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_verify 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:09:24.178 11:50:37 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 
00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:09:24.179 [2024-07-15 11:50:37.479007] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:24.179 [2024-07-15 11:50:37.479069] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1429819 ] 00:09:24.179 [2024-07-15 11:50:37.606433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:24.179 [2024-07-15 11:50:37.704262] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 
-- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- 
accel/accel.sh@20 -- # val='8 bytes' 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.179 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- 
accel/accel.sh@19 -- # read -r var val 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:24.467 11:50:37 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r 
var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:09:25.403 11:50:38 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:25.403 00:09:25.403 real 0m1.488s 00:09:25.403 user 0m1.320s 00:09:25.403 sys 0m0.178s 00:09:25.403 11:50:38 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:25.403 11:50:38 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:09:25.403 ************************************ 00:09:25.403 END TEST accel_dif_verify 00:09:25.403 
************************************ 00:09:25.403 11:50:38 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:25.403 11:50:38 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:09:25.403 11:50:38 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:25.403 11:50:38 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:25.403 11:50:38 accel -- common/autotest_common.sh@10 -- # set +x 00:09:25.662 ************************************ 00:09:25.662 START TEST accel_dif_generate 00:09:25.662 ************************************ 00:09:25.662 11:50:39 accel.accel_dif_generate -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:25.662 11:50:39 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:09:25.662 11:50:39 
accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:09:25.662 [2024-07-15 11:50:39.048492] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:25.663 [2024-07-15 11:50:39.048551] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430025 ] 00:09:25.663 [2024-07-15 11:50:39.176478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:25.921 [2024-07-15 11:50:39.274413] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- 
accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.921 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:09:25.922 11:50:39 accel.accel_dif_generate -- 
accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 
11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:25.922 11:50:39 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var 
val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:09:27.298 11:50:40 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:27.298 00:09:27.298 real 0m1.481s 00:09:27.298 user 0m1.307s 00:09:27.298 sys 0m0.184s 00:09:27.298 11:50:40 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:27.298 11:50:40 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:09:27.298 ************************************ 00:09:27.298 END TEST 
accel_dif_generate 00:09:27.298 ************************************ 00:09:27.298 11:50:40 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:27.298 11:50:40 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w dif_generate_copy 00:09:27.298 11:50:40 accel -- common/autotest_common.sh@1099 -- # '[' 6 -le 1 ']' 00:09:27.298 11:50:40 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:27.298 11:50:40 accel -- common/autotest_common.sh@10 -- # set +x 00:09:27.298 ************************************ 00:09:27.298 START TEST accel_dif_generate_copy 00:09:27.298 ************************************ 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w dif_generate_copy 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- 
accel/accel.sh@36 -- # [[ -n '' ]] 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:09:27.298 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 00:09:27.298 [2024-07-15 11:50:40.623905] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:27.298 [2024-07-15 11:50:40.624030] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430217 ] 00:09:27.298 [2024-07-15 11:50:40.817202] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.557 [2024-07-15 11:50:40.922967] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 
00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:40 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 
00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:27.557 11:50:41 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:28.934 00:09:28.934 real 0m1.590s 00:09:28.934 user 0m1.343s 00:09:28.934 sys 0m0.252s 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:28.934 11:50:42 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:09:28.934 ************************************ 00:09:28.934 END TEST 
accel_dif_generate_copy 00:09:28.934 ************************************ 00:09:28.934 11:50:42 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:28.934 11:50:42 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:09:28.934 11:50:42 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:28.934 11:50:42 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:28.934 11:50:42 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:28.934 11:50:42 accel -- common/autotest_common.sh@10 -- # set +x 00:09:28.934 ************************************ 00:09:28.934 START TEST accel_comp 00:09:28.934 ************************************ 00:09:28.934 11:50:42 accel.accel_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:28.934 11:50:42 
accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:28.934 11:50:42 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 00:09:28.934 [2024-07-15 11:50:42.279406] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:28.934 [2024-07-15 11:50:42.279467] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430419 ] 00:09:28.934 [2024-07-15 11:50:42.407425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:28.934 [2024-07-15 11:50:42.507465] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 
00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 
accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:29.194 11:50:42 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:30.631 11:50:43 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:30.631 00:09:30.631 real 0m1.517s 00:09:30.631 user 0m1.324s 00:09:30.631 sys 0m0.194s 00:09:30.631 11:50:43 accel.accel_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:30.631 11:50:43 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:09:30.631 ************************************ 00:09:30.631 END TEST accel_comp 00:09:30.631 ************************************ 00:09:30.631 11:50:43 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:30.631 11:50:43 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:30.631 11:50:43 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:30.631 11:50:43 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:30.631 11:50:43 accel -- common/autotest_common.sh@10 -- # set +x 00:09:30.631 ************************************ 00:09:30.631 START TEST accel_decomp 00:09:30.631 ************************************ 00:09:30.631 11:50:43 accel.accel_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:30.631 11:50:43 
accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:30.631 11:50:43 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:30.631 [2024-07-15 11:50:43.871138] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:30.631 [2024-07-15 11:50:43.871198] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430610 ] 00:09:30.631 [2024-07-15 11:50:44.001580] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.631 [2024-07-15 11:50:44.103998] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 
11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:30.631 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:30.632 11:50:44 
accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=1
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds'
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:30.632 11:50:44 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]]
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:32.013 11:50:45 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:09:32.013
00:09:32.013 real 0m1.512s
00:09:32.013 user 0m1.319s
00:09:32.013 sys 0m0.198s
00:09:32.013 11:50:45 accel.accel_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:32.013 11:50:45 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x
00:09:32.013 ************************************
00:09:32.013 END TEST accel_decomp
00:09:32.013 ************************************
00:09:32.013 11:50:45 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:32.013 11:50:45 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:32.013 11:50:45 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:09:32.013 11:50:45 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:32.013 11:50:45 accel -- common/autotest_common.sh@10 -- # set +x
00:09:32.013 ************************************
00:09:32.013 START TEST accel_decomp_full
00:09:32.013 ************************************
00:09:32.013 11:50:45 accel.accel_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=,
00:09:32.013 11:50:45 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r .
[2024-07-15 11:50:45.486350] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:09:32.014 [2024-07-15 11:50:45.486497] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1430873 ]
00:09:32.273 [2024-07-15 11:50:45.683666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:32.273 [2024-07-15 11:50:45.787520] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds'
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.273 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:32.533 11:50:45 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]]
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:33.471 11:50:47 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:09:33.471
00:09:33.471 real 0m1.597s
00:09:33.471 user 0m1.353s
00:09:33.471 sys 0m0.252s
00:09:33.471 11:50:47 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:33.471 11:50:47 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x
00:09:33.471 ************************************
00:09:33.471 END TEST accel_decomp_full
00:09:33.471 ************************************
00:09:33.731 11:50:47 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:33.731 11:50:47 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:33.731 11:50:47 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:09:33.731 11:50:47 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:33.731 11:50:47 accel -- common/autotest_common.sh@10 -- # set +x
00:09:33.731 ************************************
00:09:33.731 START TEST accel_decomp_mcore
00:09:33.731 ************************************
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=,
00:09:33.731 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r .
[2024-07-15 11:50:47.148465] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:09:33.731 [2024-07-15 11:50:47.148527] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1431166 ]
[2024-07-15 11:50:47.277336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
[2024-07-15 11:50:47.377773] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
[2024-07-15 11:50:47.377872] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
[2024-07-15 11:50:47.377971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
[2024-07-15 11:50:47.377971] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.990 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes'
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:33.991 11:50:47 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]]
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:09:35.371
00:09:35.371 real 0m1.506s
00:09:35.371 user 0m4.729s
00:09:35.371 sys 0m0.207s
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:35.371 11:50:48 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x
00:09:35.371 ************************************
00:09:35.371 END TEST accel_decomp_mcore
00:09:35.371 ************************************
00:09:35.371 11:50:48 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:35.371 11:50:48 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:35.371 11:50:48 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:35.371 11:50:48 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:35.371 11:50:48 accel -- common/autotest_common.sh@10 -- # set +x
00:09:35.371 ************************************
00:09:35.371 START TEST accel_decomp_full_mcore
00:09:35.371 ************************************
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:35.371 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:09:35.372 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=,
00:09:35.372 11:50:48 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r .
[2024-07-15 11:50:48.732220] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:09:35.372 [2024-07-15 11:50:48.732282] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1431371 ]
[2024-07-15 11:50:48.861493] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
[2024-07-15 11:50:48.967257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
[2024-07-15 11:50:48.967357] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
[2024-07-15 11:50:48.967456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
[2024-07-15 11:50:48.967457] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software
00:09:35.632 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 --
# IFS=: 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:35.633 11:50:49 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:37.014 00:09:37.014 real 0m1.545s 00:09:37.014 user 0m4.862s 00:09:37.014 sys 0m0.211s 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:37.014 11:50:50 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:37.014 ************************************ 00:09:37.014 END TEST accel_decomp_full_mcore 00:09:37.014 ************************************ 00:09:37.014 11:50:50 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:37.014 11:50:50 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:37.014 11:50:50 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:37.014 11:50:50 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:37.014 11:50:50 accel -- common/autotest_common.sh@10 -- # set +x 00:09:37.014 
************************************ 00:09:37.014 START TEST accel_decomp_mthread 00:09:37.014 ************************************ 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.014 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:37.015 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:37.015 [2024-07-15 11:50:50.364567] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:37.015 [2024-07-15 11:50:50.364635] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1431572 ] 00:09:37.015 [2024-07-15 11:50:50.496507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:37.015 [2024-07-15 11:50:50.600729] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 
11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:37.275 11:50:50 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:38.658 00:09:38.658 real 0m1.517s 00:09:38.658 user 0m1.332s 00:09:38.658 sys 0m0.192s 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:38.658 11:50:51 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:09:38.659 ************************************ 00:09:38.659 END TEST accel_decomp_mthread 00:09:38.659 ************************************ 00:09:38.659 11:50:51 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:38.659 11:50:51 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:38.659 11:50:51 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:38.659 11:50:51 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:38.659 11:50:51 accel -- common/autotest_common.sh@10 -- # set +x 00:09:38.659 ************************************ 00:09:38.659 START TEST accel_decomp_full_mthread 00:09:38.659 ************************************ 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:09:38.659 11:50:51 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:09:38.659 [2024-07-15 11:50:51.960978] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:38.659 [2024-07-15 11:50:51.961040] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1431768 ] 00:09:38.659 [2024-07-15 11:50:52.088967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:38.659 [2024-07-15 11:50:52.190138] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 
accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 
bytes' 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 
00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # 
IFS=: 00:09:38.920 11:50:52 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.298 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 
00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:09:40.299 00:09:40.299 real 0m1.540s 00:09:40.299 user 0m1.350s 00:09:40.299 sys 0m0.197s 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:40.299 11:50:53 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:40.299 ************************************ 00:09:40.299 END TEST accel_decomp_full_mthread 00:09:40.299 ************************************ 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:40.299 11:50:53 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:09:40.299 11:50:53 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:09:40.299 11:50:53 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:09:40.299 11:50:53 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:40.299 11:50:53 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=1431964 00:09:40.299 11:50:53 accel -- accel/accel.sh@63 -- # waitforlisten 1431964 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@829 -- 
# '[' -z 1431964 ']' 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:40.299 11:50:53 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:40.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:40.299 11:50:53 accel -- accel/accel.sh@61 -- # build_accel_config 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:40.299 11:50:53 accel -- common/autotest_common.sh@10 -- # set +x 00:09:40.299 11:50:53 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:40.299 11:50:53 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:40.299 11:50:53 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:40.299 11:50:53 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:40.299 11:50:53 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:40.299 11:50:53 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:40.299 11:50:53 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:40.299 11:50:53 accel -- accel/accel.sh@41 -- # jq -r . 00:09:40.299 [2024-07-15 11:50:53.580906] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:40.299 [2024-07-15 11:50:53.580977] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1431964 ] 00:09:40.299 [2024-07-15 11:50:53.710426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:40.299 [2024-07-15 11:50:53.817202] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:41.238 [2024-07-15 11:50:54.582449] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:41.238 11:50:54 accel -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:09:41.238 11:50:54 accel -- common/autotest_common.sh@862 -- # return 0 00:09:41.238 11:50:54 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:09:41.238 11:50:54 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:09:41.238 11:50:54 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:09:41.238 11:50:54 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:09:41.238 11:50:54 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:09:41.238 11:50:54 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:09:41.238 11:50:54 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.238 11:50:54 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:09:41.238 11:50:54 accel -- common/autotest_common.sh@10 -- # set +x 00:09:41.238 11:50:54 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:09:41.499 11:50:54 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.499 "method": "compressdev_scan_accel_module", 00:09:41.499 11:50:54 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:09:41.499 11:50:54 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:09:41.499 11:50:54 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:09:41.499 11:50:54 accel -- common/autotest_common.sh@559 -- # xtrace_disable 00:09:41.499 11:50:54 accel -- common/autotest_common.sh@10 -- # set +x 00:09:41.499 11:50:54 accel -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:09:41.499 11:50:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:54 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:54 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:54 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- 
accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # IFS== 00:09:41.499 11:50:55 accel -- accel/accel.sh@72 -- # read -r opc module 00:09:41.499 11:50:55 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:09:41.499 11:50:55 accel -- accel/accel.sh@75 -- # killprocess 1431964 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@948 -- # '[' -z 1431964 ']' 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@952 -- # kill -0 1431964 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@953 -- # uname 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1431964 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1431964' 00:09:41.499 killing process with pid 1431964 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@967 -- # 
kill 1431964 00:09:41.499 11:50:55 accel -- common/autotest_common.sh@972 -- # wait 1431964 00:09:42.068 11:50:55 accel -- accel/accel.sh@76 -- # trap - ERR 00:09:42.068 11:50:55 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:42.068 11:50:55 accel -- common/autotest_common.sh@1099 -- # '[' 8 -le 1 ']' 00:09:42.068 11:50:55 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:42.068 11:50:55 accel -- common/autotest_common.sh@10 -- # set +x 00:09:42.068 ************************************ 00:09:42.068 START TEST accel_cdev_comp 00:09:42.068 ************************************ 00:09:42.068 11:50:55 accel.accel_cdev_comp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:42.068 11:50:55 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:09:42.068 11:50:55 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:09:42.068 [2024-07-15 11:50:55.527768] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:42.068 [2024-07-15 11:50:55.527827] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1432326 ] 00:09:42.068 [2024-07-15 11:50:55.656637] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:42.326 [2024-07-15 11:50:55.763013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:43.261 [2024-07-15 11:50:56.534443] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:43.261 [2024-07-15 11:50:56.537076] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2036100 PMD being used: compress_qat 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.261 [2024-07-15 11:50:56.541192] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x203ae40 PMD being used: compress_qat 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 
11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:09:43.261 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:43.262 11:50:56 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:43.262 11:50:56 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.198 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:09:44.199 11:50:57 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:44.199 00:09:44.199 real 0m2.230s 00:09:44.199 user 0m1.627s 00:09:44.199 sys 0m0.606s 00:09:44.199 11:50:57 
accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:44.199 11:50:57 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:09:44.199 ************************************ 00:09:44.199 END TEST accel_cdev_comp 00:09:44.199 ************************************ 00:09:44.199 11:50:57 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:44.199 11:50:57 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:44.199 11:50:57 accel -- common/autotest_common.sh@1099 -- # '[' 9 -le 1 ']' 00:09:44.199 11:50:57 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:44.199 11:50:57 accel -- common/autotest_common.sh@10 -- # set +x 00:09:44.458 ************************************ 00:09:44.458 START TEST accel_cdev_decomp 00:09:44.458 ************************************ 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:09:44.458 11:50:57 
accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:09:44.458 11:50:57 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:09:44.458 [2024-07-15 11:50:57.837781] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:09:44.458 [2024-07-15 11:50:57.837844] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1432604 ] 00:09:44.458 [2024-07-15 11:50:57.966703] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:44.717 [2024-07-15 11:50:58.067813] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:45.286 [2024-07-15 11:50:58.833707] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:45.286 [2024-07-15 11:50:58.836296] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23a5100 PMD being used: compress_qat 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 
accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 [2024-07-15 11:50:58.840516] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23a9e40 PMD being used: compress_qat 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- 
accel/accel.sh@23 -- # accel_opc=decompress 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:45.286 11:50:58 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:45.286 11:50:58 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:46.665 00:09:46.665 real 0m2.218s 00:09:46.665 user 0m1.634s 00:09:46.665 sys 0m0.587s 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:46.665 11:51:00 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:09:46.665 ************************************ 00:09:46.665 END TEST accel_cdev_decomp 00:09:46.665 ************************************ 00:09:46.665 11:51:00 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:46.665 11:51:00 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:46.665 11:51:00 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:46.665 11:51:00 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:46.665 11:51:00 accel -- common/autotest_common.sh@10 -- # set +x 00:09:46.665 ************************************ 00:09:46.665 START TEST accel_cdev_decomp_full 00:09:46.665 ************************************ 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:09:46.665 11:51:00 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:09:46.665 [2024-07-15 11:51:00.132654] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:46.665 [2024-07-15 11:51:00.132720] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1432910 ] 00:09:46.665 [2024-07-15 11:51:00.258692] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:46.924 [2024-07-15 11:51:00.358542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:47.861 [2024-07-15 11:51:01.128662] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:47.861 [2024-07-15 11:51:01.131254] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24e3100 PMD being used: compress_qat 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 [2024-07-15 11:51:01.134638] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24e63d0 PMD being used: compress_qat 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:47.861 11:51:01 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.797 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:48.798 00:09:48.798 real 0m2.214s 00:09:48.798 user 0m1.641s 00:09:48.798 sys 0m0.579s 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:48.798 11:51:02 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:09:48.798 ************************************ 00:09:48.798 END TEST accel_cdev_decomp_full 00:09:48.798 ************************************ 00:09:48.798 11:51:02 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:48.798 11:51:02 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:48.798 11:51:02 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:09:48.798 11:51:02 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:48.798 11:51:02 accel -- common/autotest_common.sh@10 -- # set +x 00:09:49.056 ************************************ 00:09:49.056 START TEST accel_cdev_decomp_mcore 00:09:49.056 ************************************ 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:49.056 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:49.057 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:49.057 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:49.057 11:51:02 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:49.057 [2024-07-15 11:51:02.446140] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:49.057 [2024-07-15 11:51:02.446266] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1433377 ] 00:09:49.057 [2024-07-15 11:51:02.640113] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:09:49.315 [2024-07-15 11:51:02.750157] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:49.315 [2024-07-15 11:51:02.750259] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:49.315 [2024-07-15 11:51:02.750359] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:09:49.315 [2024-07-15 11:51:02.750360] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.008 [2024-07-15 11:51:03.513598] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:09:50.008 [2024-07-15 11:51:03.516199] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24fb740 PMD being used: compress_qat 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 
11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:09:50.008 [2024-07-15 11:51:03.522191] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f381019b8b0 PMD being used: compress_qat 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 [2024-07-15 11:51:03.523743] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2500ac0 PMD being used: compress_qat 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 [2024-07-15 11:51:03.527483] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f380819b8b0 PMD being used: compress_qat 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 [2024-07-15 11:51:03.527817] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f380019b8b0 PMD being used: compress_qat 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:50.008 11:51:03 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:50.008 11:51:03 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 
11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:51.387 00:09:51.387 real 0m2.327s 00:09:51.387 user 0m7.265s 00:09:51.387 
sys 0m0.649s 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:51.387 11:51:04 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:09:51.387 ************************************ 00:09:51.387 END TEST accel_cdev_decomp_mcore 00:09:51.387 ************************************ 00:09:51.387 11:51:04 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:51.387 11:51:04 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:51.387 11:51:04 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:09:51.387 11:51:04 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:51.387 11:51:04 accel -- common/autotest_common.sh@10 -- # set +x 00:09:51.387 ************************************ 00:09:51.387 START TEST accel_cdev_decomp_full_mcore 00:09:51.387 ************************************ 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:09:51.387 11:51:04 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:09:51.387 [2024-07-15 11:51:04.857327] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:51.387 [2024-07-15 11:51:04.857455] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1433903 ]
00:09:51.647 [2024-07-15 11:51:05.055679] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:09:51.647 [2024-07-15 11:51:05.164287] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:09:51.647 [2024-07-15 11:51:05.164392] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:09:51.647 [2024-07-15 11:51:05.164476] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:09:51.647 [2024-07-15 11:51:05.164477] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:52.585 [2024-07-15 11:51:05.934040] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:09:52.585 [2024-07-15 11:51:05.936666] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcf9740 PMD being used: compress_qat
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 [2024-07-15 11:51:05.941706] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa11419b8b0 PMD being used: compress_qat
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 [2024-07-15 11:51:05.943618] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xcf97e0 PMD being used: compress_qat
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:52.585 [2024-07-15 11:51:05.946556] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa10c19b8b0 PMD being used: compress_qat
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 [2024-07-15 11:51:05.946934] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7fa10419b8b0 PMD being used: compress_qat
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:52.585 11:51:05 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.964 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=:
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:09:53.965
00:09:53.965 real 0m2.324s
00:09:53.965 user 0m7.260s
00:09:53.965 sys 0m0.640s
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:53.965 11:51:07 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x
00:09:53.965 ************************************
00:09:53.965 END TEST accel_cdev_decomp_full_mcore
00:09:53.965 ************************************
00:09:53.965 11:51:07 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:53.965 11:51:07 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:09:53.965 11:51:07 accel -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']'
00:09:53.965 11:51:07 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:53.965 11:51:07 accel -- common/autotest_common.sh@10 -- # set +x
00:09:53.965 ************************************
00:09:53.965 START TEST accel_cdev_decomp_mthread
00:09:53.965 ************************************
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=,
00:09:53.965 11:51:07 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r .
00:09:53.965 [2024-07-15 11:51:07.247071] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:09:53.965 [2024-07-15 11:51:07.247132] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1434349 ]
00:09:53.965 [2024-07-15 11:51:07.375750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:53.965 [2024-07-15 11:51:07.476926] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:54.903 [2024-07-15 11:51:08.248763] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:09:54.903 [2024-07-15 11:51:08.251374] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ad4100 PMD being used: compress_qat
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 [2024-07-15 11:51:08.256384] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ad9240 PMD being used: compress_qat
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 [2024-07-15 11:51:08.259007] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1bfc070 PMD being used: compress_qat
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes'
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds'
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:54.903 11:51:08 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:55.841 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]]
00:09:56.101
00:09:56.101 real 0m2.228s
00:09:56.101 user 0m1.655s
00:09:56.101 sys 0m0.570s
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable
00:09:56.101 11:51:09 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x
00:09:56.101 ************************************
00:09:56.101 END TEST accel_cdev_decomp_mthread
00:09:56.101 ************************************
00:09:56.101 11:51:09 accel -- common/autotest_common.sh@1142 -- # return 0
00:09:56.101 11:51:09 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:09:56.101 11:51:09 accel -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:09:56.101 11:51:09 accel -- common/autotest_common.sh@1105 -- # xtrace_disable
00:09:56.101 11:51:09 accel -- common/autotest_common.sh@10 -- # set +x
00:09:56.101 ************************************
00:09:56.101 START TEST accel_cdev_decomp_full_mthread
00:09:56.101 ************************************
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1123 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=()
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]]
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}')
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=,
00:09:56.101 11:51:09 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r .
00:09:56.101 [2024-07-15 11:51:09.563758] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:09:56.101 [2024-07-15 11:51:09.563823] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1434707 ]
00:09:56.101 [2024-07-15 11:51:09.695170] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:56.360 [2024-07-15 11:51:09.801710] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:09:57.297 [2024-07-15 11:51:10.563528] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD
00:09:57.297 [2024-07-15 11:51:10.566177] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x177b100 PMD being used: compress_qat
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 [2024-07-15 11:51:10.570419] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x177e3d0 PMD being used: compress_qat
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 [2024-07-15 11:51:10.573267] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18a6cd0 PMD being used: compress_qat
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes'
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=:
00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.297 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:57.298 11:51:10 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.237 11:51:11 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.237 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:09:58.238 00:09:58.238 real 0m2.231s 00:09:58.238 user 0m1.647s 00:09:58.238 sys 0m0.590s 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:58.238 11:51:11 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:09:58.238 ************************************ 00:09:58.238 END TEST accel_cdev_decomp_full_mthread 00:09:58.238 ************************************ 00:09:58.238 11:51:11 accel -- 
common/autotest_common.sh@1142 -- # return 0 00:09:58.238 11:51:11 accel -- accel/accel.sh@134 -- # unset COMPRESSDEV 00:09:58.238 11:51:11 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:58.238 11:51:11 accel -- accel/accel.sh@137 -- # build_accel_config 00:09:58.238 11:51:11 accel -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:09:58.238 11:51:11 accel -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:58.238 11:51:11 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:09:58.238 11:51:11 accel -- common/autotest_common.sh@10 -- # set +x 00:09:58.238 11:51:11 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:09:58.238 11:51:11 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:09:58.238 11:51:11 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:09:58.238 11:51:11 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:09:58.238 11:51:11 accel -- accel/accel.sh@40 -- # local IFS=, 00:09:58.238 11:51:11 accel -- accel/accel.sh@41 -- # jq -r . 00:09:58.561 ************************************ 00:09:58.561 START TEST accel_dif_functional_tests 00:09:58.561 ************************************ 00:09:58.561 11:51:11 accel.accel_dif_functional_tests -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:09:58.561 [2024-07-15 11:51:11.906886] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:58.561 [2024-07-15 11:51:11.906947] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435074 ] 00:09:58.561 [2024-07-15 11:51:12.035369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:58.821 [2024-07-15 11:51:12.138636] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:58.821 [2024-07-15 11:51:12.138743] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:58.821 [2024-07-15 11:51:12.138746] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:58.821 00:09:58.821 00:09:58.821 CUnit - A unit testing framework for C - Version 2.1-3 00:09:58.821 http://cunit.sourceforge.net/ 00:09:58.821 00:09:58.821 00:09:58.821 Suite: accel_dif 00:09:58.821 Test: verify: DIF generated, GUARD check ...passed 00:09:58.821 Test: verify: DIF generated, APPTAG check ...passed 00:09:58.821 Test: verify: DIF generated, REFTAG check ...passed 00:09:58.821 Test: verify: DIF not generated, GUARD check ...[2024-07-15 11:51:12.234025] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:58.821 passed 00:09:58.821 Test: verify: DIF not generated, APPTAG check ...[2024-07-15 11:51:12.234099] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:58.821 passed 00:09:58.821 Test: verify: DIF not generated, REFTAG check ...[2024-07-15 11:51:12.234137] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:58.821 passed 00:09:58.821 Test: verify: APPTAG correct, APPTAG check ...passed 00:09:58.821 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-15 11:51:12.234214] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:09:58.821 passed 
00:09:58.821 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:09:58.821 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:09:58.821 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:09:58.821 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-15 11:51:12.234378] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:09:58.821 passed 00:09:58.821 Test: verify copy: DIF generated, GUARD check ...passed 00:09:58.821 Test: verify copy: DIF generated, APPTAG check ...passed 00:09:58.821 Test: verify copy: DIF generated, REFTAG check ...passed 00:09:58.821 Test: verify copy: DIF not generated, GUARD check ...[2024-07-15 11:51:12.234562] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:09:58.821 passed 00:09:58.821 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-15 11:51:12.234606] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:09:58.821 passed 00:09:58.821 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-15 11:51:12.234644] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:09:58.821 passed 00:09:58.821 Test: generate copy: DIF generated, GUARD check ...passed 00:09:58.821 Test: generate copy: DIF generated, APTTAG check ...passed 00:09:58.821 Test: generate copy: DIF generated, REFTAG check ...passed 00:09:58.821 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:09:58.821 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:09:58.821 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:09:58.821 Test: generate copy: iovecs-len validate ...[2024-07-15 11:51:12.234928] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:09:58.821 passed 00:09:58.821 Test: generate copy: buffer alignment validate ...passed 00:09:58.821 00:09:58.821 Run Summary: Type Total Ran Passed Failed Inactive 00:09:58.821 suites 1 1 n/a 0 0 00:09:58.821 tests 26 26 26 0 0 00:09:58.821 asserts 115 115 115 0 n/a 00:09:58.821 00:09:58.821 Elapsed time = 0.003 seconds 00:09:59.080 00:09:59.080 real 0m0.610s 00:09:59.080 user 0m0.782s 00:09:59.080 sys 0m0.224s 00:09:59.080 11:51:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:59.080 11:51:12 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:09:59.080 ************************************ 00:09:59.080 END TEST accel_dif_functional_tests 00:09:59.080 ************************************ 00:09:59.080 11:51:12 accel -- common/autotest_common.sh@1142 -- # return 0 00:09:59.080 00:09:59.080 real 0m54.252s 00:09:59.080 user 1m2.162s 00:09:59.080 sys 0m12.363s 00:09:59.080 11:51:12 accel -- common/autotest_common.sh@1124 -- # xtrace_disable 00:09:59.080 11:51:12 accel -- common/autotest_common.sh@10 -- # set +x 00:09:59.080 ************************************ 00:09:59.080 END TEST accel 00:09:59.080 ************************************ 00:09:59.080 11:51:12 -- common/autotest_common.sh@1142 -- # return 0 00:09:59.080 11:51:12 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:59.080 11:51:12 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:09:59.080 11:51:12 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:09:59.080 11:51:12 -- common/autotest_common.sh@10 -- # set +x 00:09:59.080 ************************************ 00:09:59.080 START TEST accel_rpc 00:09:59.080 ************************************ 00:09:59.080 11:51:12 accel_rpc -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:09:59.339 * Looking for test storage... 
00:09:59.339 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:09:59.339 11:51:12 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:09:59.339 11:51:12 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=1435151 00:09:59.339 11:51:12 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 1435151 00:09:59.339 11:51:12 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@829 -- # '[' -z 1435151 ']' 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@834 -- # local max_retries=100 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:59.339 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@838 -- # xtrace_disable 00:09:59.339 11:51:12 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:09:59.339 [2024-07-15 11:51:12.759025] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:09:59.339 [2024-07-15 11:51:12.759100] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435151 ] 00:09:59.339 [2024-07-15 11:51:12.881126] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:59.598 [2024-07-15 11:51:12.980236] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:00.166 11:51:13 accel_rpc -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:00.166 11:51:13 accel_rpc -- common/autotest_common.sh@862 -- # return 0 00:10:00.166 11:51:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:10:00.166 11:51:13 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:10:00.166 11:51:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:10:00.166 11:51:13 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:10:00.166 11:51:13 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:10:00.166 11:51:13 accel_rpc -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:00.166 11:51:13 accel_rpc -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:00.166 11:51:13 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:00.166 ************************************ 00:10:00.166 START TEST accel_assign_opcode 00:10:00.166 ************************************ 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1123 -- # accel_assign_opcode_test_suite 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:00.166 [2024-07-15 11:51:13.734615] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:00.166 [2024-07-15 11:51:13.742629] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.166 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:10:00.424 11:51:13 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:00.424 software 00:10:00.424 00:10:00.424 real 0m0.285s 00:10:00.424 user 0m0.052s 00:10:00.424 sys 0m0.010s 00:10:00.424 11:51:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # 
xtrace_disable 00:10:00.424 11:51:14 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:10:00.424 ************************************ 00:10:00.424 END TEST accel_assign_opcode 00:10:00.424 ************************************ 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@1142 -- # return 0 00:10:00.683 11:51:14 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 1435151 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@948 -- # '[' -z 1435151 ']' 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@952 -- # kill -0 1435151 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@953 -- # uname 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1435151 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1435151' 00:10:00.683 killing process with pid 1435151 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@967 -- # kill 1435151 00:10:00.683 11:51:14 accel_rpc -- common/autotest_common.sh@972 -- # wait 1435151 00:10:00.942 00:10:00.942 real 0m1.891s 00:10:00.942 user 0m1.977s 00:10:00.942 sys 0m0.581s 00:10:00.942 11:51:14 accel_rpc -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:00.942 11:51:14 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:10:00.942 ************************************ 00:10:00.942 END TEST accel_rpc 00:10:00.942 ************************************ 00:10:00.942 11:51:14 -- common/autotest_common.sh@1142 -- # return 0 00:10:00.942 11:51:14 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:10:00.942 11:51:14 -- 
common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:00.942 11:51:14 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:00.942 11:51:14 -- common/autotest_common.sh@10 -- # set +x 00:10:01.202 ************************************ 00:10:01.202 START TEST app_cmdline 00:10:01.202 ************************************ 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:10:01.202 * Looking for test storage... 00:10:01.202 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:10:01.202 11:51:14 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:10:01.202 11:51:14 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=1435558 00:10:01.202 11:51:14 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 1435558 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@829 -- # '[' -z 1435558 ']' 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:01.202 11:51:14 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:01.202 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:01.202 11:51:14 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:01.461 [2024-07-15 11:51:14.798653] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:10:01.461 [2024-07-15 11:51:14.798813] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435558 ] 00:10:01.461 [2024-07-15 11:51:14.993428] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:01.721 [2024-07-15 11:51:15.100561] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:01.981 11:51:15 app_cmdline -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:01.981 11:51:15 app_cmdline -- common/autotest_common.sh@862 -- # return 0 00:10:01.981 11:51:15 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:10:02.550 { 00:10:02.550 "version": "SPDK v24.09-pre git sha1 2728651ee", 00:10:02.550 "fields": { 00:10:02.550 "major": 24, 00:10:02.550 "minor": 9, 00:10:02.550 "patch": 0, 00:10:02.550 "suffix": "-pre", 00:10:02.550 "commit": "2728651ee" 00:10:02.550 } 00:10:02.550 } 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@26 -- # sort 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:02.550 11:51:15 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:10:02.550 11:51:15 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@648 -- # local es=0 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:02.550 11:51:15 app_cmdline -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:10:02.550 request: 00:10:02.550 { 00:10:02.550 "method": "env_dpdk_get_mem_stats", 00:10:02.550 "req_id": 1 00:10:02.550 } 00:10:02.550 Got JSON-RPC error response 00:10:02.550 response: 00:10:02.550 { 00:10:02.550 
"code": -32601, 00:10:02.550 "message": "Method not found" 00:10:02.550 } 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@651 -- # es=1 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:10:02.550 11:51:16 app_cmdline -- app/cmdline.sh@1 -- # killprocess 1435558 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@948 -- # '[' -z 1435558 ']' 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@952 -- # kill -0 1435558 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@953 -- # uname 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:02.550 11:51:16 app_cmdline -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1435558 00:10:02.810 11:51:16 app_cmdline -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:02.810 11:51:16 app_cmdline -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:02.810 11:51:16 app_cmdline -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1435558' 00:10:02.810 killing process with pid 1435558 00:10:02.810 11:51:16 app_cmdline -- common/autotest_common.sh@967 -- # kill 1435558 00:10:02.810 11:51:16 app_cmdline -- common/autotest_common.sh@972 -- # wait 1435558 00:10:03.069 00:10:03.069 real 0m1.976s 00:10:03.069 user 0m2.549s 00:10:03.069 sys 0m0.707s 00:10:03.069 11:51:16 app_cmdline -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:03.069 11:51:16 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:10:03.069 ************************************ 00:10:03.069 END TEST app_cmdline 00:10:03.069 ************************************ 00:10:03.069 11:51:16 -- common/autotest_common.sh@1142 -- # return 0 00:10:03.069 11:51:16 -- spdk/autotest.sh@186 -- # run_test version 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:10:03.069 11:51:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:03.069 11:51:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:03.069 11:51:16 -- common/autotest_common.sh@10 -- # set +x 00:10:03.069 ************************************ 00:10:03.069 START TEST version 00:10:03.069 ************************************ 00:10:03.069 11:51:16 version -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:10:03.328 * Looking for test storage... 00:10:03.328 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:10:03.328 11:51:16 version -- app/version.sh@17 -- # get_header_version major 00:10:03.328 11:51:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # cut -f2 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # tr -d '"' 00:10:03.328 11:51:16 version -- app/version.sh@17 -- # major=24 00:10:03.328 11:51:16 version -- app/version.sh@18 -- # get_header_version minor 00:10:03.328 11:51:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # cut -f2 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # tr -d '"' 00:10:03.328 11:51:16 version -- app/version.sh@18 -- # minor=9 00:10:03.328 11:51:16 version -- app/version.sh@19 -- # get_header_version patch 00:10:03.328 11:51:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # cut -f2 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # tr -d '"' 00:10:03.328 
11:51:16 version -- app/version.sh@19 -- # patch=0 00:10:03.328 11:51:16 version -- app/version.sh@20 -- # get_header_version suffix 00:10:03.328 11:51:16 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # cut -f2 00:10:03.328 11:51:16 version -- app/version.sh@14 -- # tr -d '"' 00:10:03.328 11:51:16 version -- app/version.sh@20 -- # suffix=-pre 00:10:03.328 11:51:16 version -- app/version.sh@22 -- # version=24.9 00:10:03.328 11:51:16 version -- app/version.sh@25 -- # (( patch != 0 )) 00:10:03.328 11:51:16 version -- app/version.sh@28 -- # version=24.9rc0 00:10:03.328 11:51:16 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:10:03.328 11:51:16 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:10:03.328 11:51:16 version -- app/version.sh@30 -- # py_version=24.9rc0 00:10:03.328 11:51:16 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:10:03.328 00:10:03.328 real 0m0.193s 00:10:03.328 user 0m0.089s 00:10:03.328 sys 0m0.153s 00:10:03.328 11:51:16 version -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:03.328 11:51:16 version -- common/autotest_common.sh@10 -- # set +x 00:10:03.328 ************************************ 00:10:03.328 END TEST version 00:10:03.328 ************************************ 00:10:03.328 11:51:16 -- common/autotest_common.sh@1142 -- # return 0 00:10:03.328 11:51:16 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:10:03.328 11:51:16 -- spdk/autotest.sh@189 -- # run_test blockdev_general 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:10:03.328 11:51:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:10:03.328 11:51:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:03.328 11:51:16 -- common/autotest_common.sh@10 -- # set +x 00:10:03.328 ************************************ 00:10:03.328 START TEST blockdev_general 00:10:03.328 ************************************ 00:10:03.328 11:51:16 blockdev_general -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:10:03.588 * Looking for test storage... 00:10:03.588 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:03.588 11:51:17 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 
00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1435997 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:10:03.588 11:51:17 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 1435997 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@829 -- # '[' -z 1435997 ']' 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:10:03.588 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:03.588 11:51:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:03.588 [2024-07-15 11:51:17.095578] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:10:03.588 [2024-07-15 11:51:17.095648] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1435997 ] 00:10:03.847 [2024-07-15 11:51:17.221833] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:03.847 [2024-07-15 11:51:17.319141] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:04.786 11:51:18 blockdev_general -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:04.786 11:51:18 blockdev_general -- common/autotest_common.sh@862 -- # return 0 00:10:04.786 11:51:18 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:10:04.786 11:51:18 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:10:04.786 11:51:18 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:10:04.786 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:04.786 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:04.786 [2024-07-15 11:51:18.255915] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:04.786 [2024-07-15 11:51:18.255967] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:04.786 00:10:04.786 [2024-07-15 11:51:18.263905] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:04.786 [2024-07-15 11:51:18.263932] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Malloc2 00:10:04.786 00:10:04.786 Malloc0 00:10:04.786 Malloc1 00:10:04.786 Malloc2 00:10:04.786 Malloc3 00:10:04.786 Malloc4 00:10:04.786 Malloc5 00:10:04.786 Malloc6 00:10:04.786 Malloc7 00:10:05.045 Malloc8 00:10:05.045 Malloc9 00:10:05.045 [2024-07-15 11:51:18.400134] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:05.045 [2024-07-15 11:51:18.400180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:05.045 [2024-07-15 11:51:18.400199] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc6060 00:10:05.045 [2024-07-15 11:51:18.400212] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:05.045 [2024-07-15 11:51:18.401561] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:05.045 [2024-07-15 11:51:18.401589] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:05.045 TestPT 00:10:05.045 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.045 11:51:18 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:10:05.045 5000+0 records in 00:10:05.045 5000+0 records out 00:10:05.045 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0278187 s, 368 MB/s 00:10:05.045 11:51:18 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:10:05.045 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.045 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.045 AIO0 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:10:05.046 11:51:18 blockdev_general -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@559 -- # xtrace_disable 00:10:05.046 11:51:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.046 11:51:18 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:10:05.306 11:51:18 blockdev_general -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:10:05.306 11:51:18 blockdev_general 
-- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:10:05.306 11:51:18 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 00:10:05.308 11:51:18 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ab12ddf7-10cb-4706-b517-b248bc2afc4d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab12ddf7-10cb-4706-b517-b248bc2afc4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2253fc13-25f2-52e2-b62d-6fd61864c6f9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2253fc13-25f2-52e2-b62d-6fd61864c6f9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' 
"compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dc788c59-8f42-55b6-8ed5-1ff0432fa21c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc788c59-8f42-55b6-8ed5-1ff0432fa21c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "15c91d20-bbd4-5ede-a89d-6661c16a5ded"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c91d20-bbd4-5ede-a89d-6661c16a5ded",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' 
"abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "f7cf4fcd-1cdd-50c0-ba32-923a741b9612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f7cf4fcd-1cdd-50c0-ba32-923a741b9612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": 
false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "85920319-cf12-53ae-8c1c-1720b376fae8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85920319-cf12-53ae-8c1c-1720b376fae8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b2abe9f1-8d7a-5526-9092-30565e923c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2abe9f1-8d7a-5526-9092-30565e923c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' 
' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "2e0850d1-5cdb-5ad8-b345-677c96a9baa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2e0850d1-5cdb-5ad8-b345-677c96a9baa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d58b8176-7bed-5190-852a-a4bade7b7069"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d58b8176-7bed-5190-852a-a4bade7b7069",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "bf618af8-6c69-577e-b205-471eca47978a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf618af8-6c69-577e-b205-471eca47978a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "482c6328-8eef-5d90-a5cd-eb71d288d612"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "482c6328-8eef-5d90-a5cd-eb71d288d612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' 
' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "30e250e7-da5e-4319-8706-27566513e1e3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "dc2095c8-8690-4dcf-97d0-8b99c775ed60",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "db5e3a0f-26e4-4c55-8637-79b9a3ee42ea",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' 
'}' '{' ' "name": "concat0",' ' "aliases": [' ' "8f77d098-b4fe-4084-8986-edc119c38216"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d16d9db0-3e4b-49d8-9f3d-ffddbc74d623",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "557ebcf3-70cc-413b-80ea-2b2360cfc9e1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' 
"uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "862d35fb-c5fd-4811-9008-b6281e1db460",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "abc4f20c-6db1-4e15-b8f3-add206282ceb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "bc8ff510-e2b8-4026-9746-d14bf6770e02"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "bc8ff510-e2b8-4026-9746-d14bf6770e02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' 
' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:05.308 11:51:18 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:10:05.308 11:51:18 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:10:05.308 11:51:18 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:10:05.308 11:51:18 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 1435997 00:10:05.308 11:51:18 blockdev_general -- common/autotest_common.sh@948 -- # '[' -z 1435997 ']' 00:10:05.308 11:51:18 blockdev_general -- common/autotest_common.sh@952 -- # kill -0 1435997 00:10:05.308 11:51:18 blockdev_general -- common/autotest_common.sh@953 -- # uname 00:10:05.308 11:51:18 blockdev_general -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:05.308 11:51:18 blockdev_general -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1435997 00:10:05.567 11:51:18 blockdev_general -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:05.567 11:51:18 blockdev_general -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:05.567 11:51:18 blockdev_general -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1435997' 00:10:05.567 killing process with pid 1435997 00:10:05.567 11:51:18 blockdev_general -- 
common/autotest_common.sh@967 -- # kill 1435997 00:10:05.567 11:51:18 blockdev_general -- common/autotest_common.sh@972 -- # wait 1435997 00:10:05.826 11:51:19 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:05.826 11:51:19 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:05.826 11:51:19 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:10:05.826 11:51:19 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:05.826 11:51:19 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:06.084 ************************************ 00:10:06.084 START TEST bdev_hello_world 00:10:06.084 ************************************ 00:10:06.084 11:51:19 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:10:06.084 [2024-07-15 11:51:19.521158] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:10:06.084 [2024-07-15 11:51:19.521237] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1436246 ] 00:10:06.084 [2024-07-15 11:51:19.653601] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:06.345 [2024-07-15 11:51:19.762961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:06.345 [2024-07-15 11:51:19.920039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:06.345 [2024-07-15 11:51:19.920095] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:06.345 [2024-07-15 11:51:19.920111] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:06.345 [2024-07-15 11:51:19.928044] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:06.345 [2024-07-15 11:51:19.928071] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:06.345 [2024-07-15 11:51:19.936052] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:06.345 [2024-07-15 11:51:19.936077] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:06.628 [2024-07-15 11:51:20.014972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:06.628 [2024-07-15 11:51:20.015038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:06.628 [2024-07-15 11:51:20.015060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1034e00 00:10:06.628 [2024-07-15 11:51:20.015078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:06.628 [2024-07-15 11:51:20.016853] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:10:06.628 [2024-07-15 11:51:20.016893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:06.628 [2024-07-15 11:51:20.156134] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:10:06.628 [2024-07-15 11:51:20.156205] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:10:06.628 [2024-07-15 11:51:20.156261] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:10:06.628 [2024-07-15 11:51:20.156338] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:10:06.628 [2024-07-15 11:51:20.156414] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:10:06.628 [2024-07-15 11:51:20.156445] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:10:06.628 [2024-07-15 11:51:20.156509] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:10:06.628 00:10:06.628 [2024-07-15 11:51:20.156550] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:10:07.195 00:10:07.195 real 0m1.041s 00:10:07.195 user 0m0.690s 00:10:07.195 sys 0m0.312s 00:10:07.195 11:51:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:07.195 11:51:20 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:10:07.195 ************************************ 00:10:07.195 END TEST bdev_hello_world 00:10:07.195 ************************************ 00:10:07.195 11:51:20 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:07.195 11:51:20 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:10:07.195 11:51:20 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:07.195 11:51:20 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:07.195 11:51:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.195 ************************************ 00:10:07.195 START 
TEST bdev_bounds 00:10:07.195 ************************************ 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1436430 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1436430' 00:10:07.195 Process bdevio pid: 1436430 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1436430 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1436430 ']' 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:07.195 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:07.195 11:51:20 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:07.195 [2024-07-15 11:51:20.644537] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:10:07.195 [2024-07-15 11:51:20.644589] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1436430 ] 00:10:07.195 [2024-07-15 11:51:20.759530] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:10:07.452 [2024-07-15 11:51:20.869977] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:07.452 [2024-07-15 11:51:20.870076] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:10:07.452 [2024-07-15 11:51:20.870077] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.452 [2024-07-15 11:51:21.018557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:07.452 [2024-07-15 11:51:21.018618] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:07.452 [2024-07-15 11:51:21.018633] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:07.452 [2024-07-15 11:51:21.026571] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.452 [2024-07-15 11:51:21.026599] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:07.452 [2024-07-15 11:51:21.034585] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.452 [2024-07-15 11:51:21.034611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:07.710 [2024-07-15 11:51:21.107184] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:07.710 [2024-07-15 11:51:21.107236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:07.710 [2024-07-15 11:51:21.107253] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x194b4c0 
00:10:07.710 [2024-07-15 11:51:21.107271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:07.710 [2024-07-15 11:51:21.108927] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:07.710 [2024-07-15 11:51:21.108959] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:08.278 11:51:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:08.278 11:51:21 blockdev_general.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:10:08.278 11:51:21 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:10:08.278 I/O targets: 00:10:08.278 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:10:08.278 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:10:08.278 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:10:08.278 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:10:08.278 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:10:08.278 raid0: 131072 blocks of 512 bytes (64 MiB) 00:10:08.278 concat0: 131072 blocks of 512 bytes (64 MiB) 00:10:08.278 raid1: 65536 blocks of 512 bytes (32 MiB) 00:10:08.278 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:10:08.278 00:10:08.278 00:10:08.278 CUnit - A unit testing framework for C - Version 2.1-3 00:10:08.278 http://cunit.sourceforge.net/ 00:10:08.278 00:10:08.278 00:10:08.278 Suite: bdevio tests on: AIO0 00:10:08.278 Test: blockdev write read block ...passed 00:10:08.278 Test: blockdev write zeroes read block ...passed 00:10:08.278 
Test: blockdev write zeroes read no split ...passed 00:10:08.278 Test: blockdev write zeroes read split ...passed 00:10:08.278 Test: blockdev write zeroes read split partial ...passed 00:10:08.278 Test: blockdev reset ...passed 00:10:08.278 Test: blockdev write read 8 blocks ...passed 00:10:08.278 Test: blockdev write read size > 128k ...passed 00:10:08.278 Test: blockdev write read invalid size ...passed 00:10:08.278 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.278 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.278 Test: blockdev write read max offset ...passed 00:10:08.278 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.278 Test: blockdev writev readv 8 blocks ...passed 00:10:08.278 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.278 Test: blockdev writev readv block ...passed 00:10:08.278 Test: blockdev writev readv size > 128k ...passed 00:10:08.278 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.278 Test: blockdev comparev and writev ...passed 00:10:08.278 Test: blockdev nvme passthru rw ...passed 00:10:08.278 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.278 Test: blockdev nvme admin passthru ...passed 00:10:08.278 Test: blockdev copy ...passed 00:10:08.278 Suite: bdevio tests on: raid1 00:10:08.278 Test: blockdev write read block ...passed 00:10:08.278 Test: blockdev write zeroes read block ...passed 00:10:08.278 Test: blockdev write zeroes read no split ...passed 00:10:08.278 Test: blockdev write zeroes read split ...passed 00:10:08.278 Test: blockdev write zeroes read split partial ...passed 00:10:08.278 Test: blockdev reset ...passed 00:10:08.278 Test: blockdev write read 8 blocks ...passed 00:10:08.278 Test: blockdev write read size > 128k ...passed 00:10:08.278 Test: blockdev write read invalid size ...passed 00:10:08.278 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:10:08.278 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.278 Test: blockdev write read max offset ...passed 00:10:08.278 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.278 Test: blockdev writev readv 8 blocks ...passed 00:10:08.278 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.278 Test: blockdev writev readv block ...passed 00:10:08.278 Test: blockdev writev readv size > 128k ...passed 00:10:08.278 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.278 Test: blockdev comparev and writev ...passed 00:10:08.278 Test: blockdev nvme passthru rw ...passed 00:10:08.278 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.278 Test: blockdev nvme admin passthru ...passed 00:10:08.278 Test: blockdev copy ...passed 00:10:08.278 Suite: bdevio tests on: concat0 00:10:08.278 Test: blockdev write read block ...passed 00:10:08.278 Test: blockdev write zeroes read block ...passed 00:10:08.278 Test: blockdev write zeroes read no split ...passed 00:10:08.278 Test: blockdev write zeroes read split ...passed 00:10:08.278 Test: blockdev write zeroes read split partial ...passed 00:10:08.278 Test: blockdev reset ...passed 00:10:08.278 Test: blockdev write read 8 blocks ...passed 00:10:08.278 Test: blockdev write read size > 128k ...passed 00:10:08.278 Test: blockdev write read invalid size ...passed 00:10:08.278 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.278 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.278 Test: blockdev write read max offset ...passed 00:10:08.278 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.278 Test: blockdev writev readv 8 blocks ...passed 00:10:08.278 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.278 Test: blockdev writev readv block ...passed 00:10:08.278 Test: blockdev writev readv size > 128k ...passed 00:10:08.278 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:10:08.278 Test: blockdev comparev and writev ...passed 00:10:08.278 Test: blockdev nvme passthru rw ...passed 00:10:08.278 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.278 Test: blockdev nvme admin passthru ...passed 00:10:08.278 Test: blockdev copy ...passed 00:10:08.278 Suite: bdevio tests on: raid0 00:10:08.278 Test: blockdev write read block ...passed 00:10:08.278 Test: blockdev write zeroes read block ...passed 00:10:08.278 Test: blockdev write zeroes read no split ...passed 00:10:08.278 Test: blockdev write zeroes read split ...passed 00:10:08.278 Test: blockdev write zeroes read split partial ...passed 00:10:08.278 Test: blockdev reset ...passed 00:10:08.278 Test: blockdev write read 8 blocks ...passed 00:10:08.278 Test: blockdev write read size > 128k ...passed 00:10:08.278 Test: blockdev write read invalid size ...passed 00:10:08.278 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.278 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.278 Test: blockdev write read max offset ...passed 00:10:08.278 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.278 Test: blockdev writev readv 8 blocks ...passed 00:10:08.278 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.278 Test: blockdev writev readv block ...passed 00:10:08.278 Test: blockdev writev readv size > 128k ...passed 00:10:08.278 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.278 Test: blockdev comparev and writev ...passed 00:10:08.278 Test: blockdev nvme passthru rw ...passed 00:10:08.278 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.278 Test: blockdev nvme admin passthru ...passed 00:10:08.278 Test: blockdev copy ...passed 00:10:08.278 Suite: bdevio tests on: TestPT 00:10:08.278 Test: blockdev write read block ...passed 00:10:08.278 Test: blockdev write zeroes read block ...passed 
00:10:08.278 Test: blockdev write zeroes read no split ...passed 00:10:08.278 Test: blockdev write zeroes read split ...passed 00:10:08.278 Test: blockdev write zeroes read split partial ...passed 00:10:08.278 Test: blockdev reset ...passed 00:10:08.278 Test: blockdev write read 8 blocks ...passed 00:10:08.278 Test: blockdev write read size > 128k ...passed 00:10:08.278 Test: blockdev write read invalid size ...passed 00:10:08.278 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.278 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.278 Test: blockdev write read max offset ...passed 00:10:08.278 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.279 Test: blockdev writev readv 8 blocks ...passed 00:10:08.279 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.279 Test: blockdev writev readv block ...passed 00:10:08.279 Test: blockdev writev readv size > 128k ...passed 00:10:08.279 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.279 Test: blockdev comparev and writev ...passed 00:10:08.279 Test: blockdev nvme passthru rw ...passed 00:10:08.279 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.279 Test: blockdev nvme admin passthru ...passed 00:10:08.279 Test: blockdev copy ...passed 00:10:08.279 Suite: bdevio tests on: Malloc2p7 00:10:08.279 Test: blockdev write read block ...passed 00:10:08.279 Test: blockdev write zeroes read block ...passed 00:10:08.279 Test: blockdev write zeroes read no split ...passed 00:10:08.538 Test: blockdev write zeroes read split ...passed 00:10:08.538 Test: blockdev write zeroes read split partial ...passed 00:10:08.538 Test: blockdev reset ...passed 00:10:08.538 Test: blockdev write read 8 blocks ...passed 00:10:08.538 Test: blockdev write read size > 128k ...passed 00:10:08.538 Test: blockdev write read invalid size ...passed 00:10:08.538 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:10:08.538 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.538 Test: blockdev write read max offset ...passed 00:10:08.538 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.538 Test: blockdev writev readv 8 blocks ...passed 00:10:08.538 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.538 Test: blockdev writev readv block ...passed 00:10:08.538 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p6 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p5 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p4 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block 
...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p3 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p2 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 
00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p1 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc2p0 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write 
zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc1p1 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc1p0 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 
128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 Suite: bdevio tests on: Malloc0 00:10:08.539 Test: blockdev write read block ...passed 00:10:08.539 Test: blockdev write zeroes read block ...passed 00:10:08.539 Test: blockdev write zeroes read no split ...passed 00:10:08.539 Test: blockdev write zeroes read split ...passed 00:10:08.539 Test: blockdev write zeroes read split partial ...passed 00:10:08.539 Test: blockdev reset ...passed 00:10:08.539 Test: blockdev write read 8 blocks ...passed 00:10:08.539 Test: blockdev write read size > 128k ...passed 00:10:08.539 Test: blockdev write read invalid size ...passed 00:10:08.539 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:10:08.539 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:10:08.539 Test: blockdev write read max offset ...passed 00:10:08.539 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:10:08.539 Test: blockdev writev readv 8 blocks ...passed 00:10:08.539 Test: blockdev writev readv 30 x 1block ...passed 00:10:08.539 Test: blockdev writev readv block ...passed 00:10:08.539 Test: blockdev writev readv size > 128k ...passed 00:10:08.539 Test: blockdev writev readv size > 128k in two iovs ...passed 00:10:08.539 Test: blockdev comparev and writev ...passed 00:10:08.539 Test: blockdev nvme passthru rw ...passed 00:10:08.539 Test: blockdev nvme passthru vendor specific ...passed 00:10:08.539 Test: blockdev nvme admin passthru ...passed 00:10:08.539 Test: blockdev copy ...passed 00:10:08.539 00:10:08.539 Run Summary: Type Total Ran Passed Failed Inactive 00:10:08.540 suites 16 16 n/a 0 0 00:10:08.540 
tests 368 368 368 0 0 00:10:08.540 asserts 2224 2224 2224 0 n/a 00:10:08.540 00:10:08.540 Elapsed time = 0.669 seconds 00:10:08.540 0 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1436430 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1436430 ']' 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1436430 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1436430 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1436430' 00:10:08.540 killing process with pid 1436430 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1436430 00:10:08.540 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1436430 00:10:09.107 11:51:22 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:10:09.107 00:10:09.107 real 0m1.819s 00:10:09.107 user 0m4.560s 00:10:09.107 sys 0m0.496s 00:10:09.107 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:09.107 11:51:22 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:10:09.107 ************************************ 00:10:09.107 END TEST bdev_bounds 00:10:09.107 ************************************ 00:10:09.107 11:51:22 blockdev_general -- common/autotest_common.sh@1142 -- # return 
0 00:10:09.107 11:51:22 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:09.107 11:51:22 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:10:09.107 11:51:22 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:09.107 11:51:22 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:09.107 ************************************ 00:10:09.107 START TEST bdev_nbd 00:10:09.107 ************************************ 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:10:09.107 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1436806 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1436806 /var/tmp/spdk-nbd.sock 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1436806 ']' 00:10:09.108 11:51:22 
blockdev_general.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:10:09.108 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:10:09.108 11:51:22 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:09.108 [2024-07-15 11:51:22.557206] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:10:09.108 [2024-07-15 11:51:22.557253] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:09.108 [2024-07-15 11:51:22.662038] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:09.367 [2024-07-15 11:51:22.762130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:09.367 [2024-07-15 11:51:22.922319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:09.367 [2024-07-15 11:51:22.922385] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:09.367 [2024-07-15 11:51:22.922399] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:09.367 [2024-07-15 11:51:22.930327] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:09.367 [2024-07-15 11:51:22.930356] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:09.367 [2024-07-15 11:51:22.938338] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:09.367 [2024-07-15 11:51:22.938369] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:09.626 [2024-07-15 11:51:23.015382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:09.626 [2024-07-15 11:51:23.015436] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:09.626 [2024-07-15 11:51:23.015452] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf426f0 00:10:09.626 [2024-07-15 11:51:23.015465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:09.626 [2024-07-15 11:51:23.016908] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:09.626 [2024-07-15 11:51:23.016939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx 
/var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:10.194 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 
00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.453 1+0 records in 00:10:10.453 1+0 records out 00:10:10.453 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245849 s, 16.7 MB/s 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:10.453 11:51:23 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:10:10.453 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.713 1+0 records in 00:10:10.713 1+0 records out 00:10:10.713 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000329881 s, 12.4 MB/s 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:10.713 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:10.972 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:10.973 1+0 records in 00:10:10.973 1+0 records out 00:10:10.973 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245653 s, 16.7 MB/s 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
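Each probe above reads a single 4096-byte block off the freshly exported NBD device with `dd`, which reports throughput in SI megabytes per second (bytes / elapsed / 10^6). As a sanity check, the figures printed in the trace can be recomputed from the logged byte count and elapsed time; the helper name `rate_mb_s` below is illustrative, and the two elapsed values are copied from the nbd1 and nbd2 records above.

```shell
# dd reports its transfer rate as bytes / seconds in SI MB/s; recompute it
# for two of the 4096-byte probes logged above (elapsed times copied from
# the trace verbatim).
rate_mb_s() { awk -v b="$1" -v s="$2" 'BEGIN { printf "%.1f\n", b / s / 1e6 }'; }

rate_mb_s 4096 0.000329881   # nbd1 probe -> 12.4
rate_mb_s 4096 0.000245653   # nbd2 probe -> 16.7
```

Both results agree with the "12.4 MB/s" and "16.7 MB/s" figures in the records above.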
00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:10.973 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.232 1+0 records in 00:10:11.232 1+0 records out 00:10:11.232 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000324725 s, 12.6 MB/s 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:11.232 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
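The `waitfornbd` helper whose xtrace repeats for every device above follows a simple bounded-retry pattern: grep the device name as a whole word out of `/proc/partitions` up to 20 times, then break once it appears. A hypothetical standalone reconstruction of that loop is sketched below; the function name `waitfornbd_sketch` and the parameterized partitions file are assumptions made so the loop can run without a real NBD device (the real helper greps `/proc/partitions` directly).

```shell
# Hypothetical reconstruction of the waitfornbd retry loop traced above:
# poll a partitions table up to 20 times for the device name as a whole
# word, sleeping briefly between attempts; fail after the retry budget.
waitfornbd_sketch() {
    local nbd_name=$1 partitions=$2 i
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$nbd_name" "$partitions" && return 0
        sleep 0.1
    done
    return 1
}

# Exercise the loop against a fake partitions table with nbd0 present.
printf 'major minor  #blocks  name\n  43  0  1024  nbd0\n' > /tmp/partitions_sketch.txt
waitfornbd_sketch nbd0 /tmp/partitions_sketch.txt && echo "nbd0 ready"
```

In the actual run, the `break` after a successful grep is followed by the one-block `dd` read shown in the trace, confirming the device also services I/O before the test proceeds.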
00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.492 1+0 records in 00:10:11.492 1+0 records out 00:10:11.492 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000358325 s, 11.4 MB/s 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:11.492 11:51:24 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i 
<= 20 )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:11.751 1+0 records in 00:10:11.751 1+0 records out 00:10:11.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000399433 s, 10.3 MB/s 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:11.751 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.010 1+0 records in 00:10:12.010 1+0 records out 00:10:12.010 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000409547 s, 10.0 MB/s 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.010 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.010 
11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.270 1+0 records in 00:10:12.270 1+0 records out 00:10:12.270 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000523997 s, 7.8 MB/s 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.270 11:51:25 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.530 1+0 records in 00:10:12.530 1+0 records out 
00:10:12.530 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000468624 s, 8.7 MB/s 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.530 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:12.790 11:51:26 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:12.790 1+0 records in 00:10:12.790 1+0 records out 00:10:12.790 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000440437 s, 9.3 MB/s 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:12.790 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 
00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.049 1+0 records in 00:10:13.049 1+0 records out 00:10:13.049 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000577568 s, 7.1 MB/s 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.049 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.050 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:13.308 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:13.309 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.309 1+0 records in 00:10:13.309 1+0 records out 00:10:13.309 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00049434 s, 8.3 MB/s 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:13.567 11:51:26 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.567 11:51:26 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:13.825 1+0 records in 00:10:13.825 1+0 records out 00:10:13.825 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00077016 s, 5.3 MB/s 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:13.825 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.083 1+0 records in 00:10:14.083 1+0 records out 00:10:14.083 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000758272 s, 5.4 MB/s 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.083 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:10:14.342 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:10:14.342 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:10:14.342 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q 
-w nbd14 /proc/partitions 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.343 1+0 records in 00:10:14.343 1+0 records out 00:10:14.343 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000861352 s, 4.8 MB/s 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.343 11:51:27 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:14.602 1+0 records in 00:10:14.602 1+0 records out 00:10:14.602 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000547482 s, 7.5 MB/s 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:10:14.602 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
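The start-disk loop traced above repeats one pattern per device: `nbd_start_disk` over RPC, then `waitfornbd` polls `/proc/partitions` until the device appears, and finally a single 4 KiB `dd` read with `iflag=direct` whose output size is checked with `stat`. The sketch below is a simplified, standalone reconstruction of that check (not the actual `autotest_common.sh` code); the function name `wait_and_verify` is mine, and `iflag=direct` from the log is dropped so the sketch also works on regular files.

```shell
# Simplified sketch of the waitfornbd + dd-verify pattern seen in the
# trace. Poll /proc/partitions for the device name, then prove the
# device serves I/O by copying one 4 KiB block and checking the byte
# count, as the harness does with stat -c %s.
wait_and_verify() {
    local dev=$1 name tmpfile size i
    name=$(basename "$dev")
    # The trace retries up to 20 times ("(( i <= 20 ))").
    for ((i = 1; i <= 20; i++)); do
        grep -q -w "$name" /proc/partitions && break
        sleep 0.1
    done
    tmpfile=$(mktemp)
    dd if="$dev" of="$tmpfile" bs=4096 count=1 2>/dev/null
    size=$(stat -c %s "$tmpfile")
    rm -f "$tmpfile"
    # Mirrors the harness's '[' 4096 '!=' 0 ']' size check.
    [ "$size" -eq 4096 ]
}
```

For a regular file the partition poll simply times out and the read check still runs, which is why the sketch is testable without an nbd device.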
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:14.861 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd0", 00:10:14.861 "bdev_name": "Malloc0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd1", 00:10:14.861 "bdev_name": "Malloc1p0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd2", 00:10:14.861 "bdev_name": "Malloc1p1" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd3", 00:10:14.861 "bdev_name": "Malloc2p0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd4", 00:10:14.861 "bdev_name": "Malloc2p1" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd5", 00:10:14.861 "bdev_name": "Malloc2p2" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd6", 00:10:14.861 "bdev_name": "Malloc2p3" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd7", 00:10:14.861 "bdev_name": "Malloc2p4" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd8", 00:10:14.861 "bdev_name": "Malloc2p5" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd9", 00:10:14.861 "bdev_name": "Malloc2p6" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd10", 00:10:14.861 "bdev_name": "Malloc2p7" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd11", 00:10:14.861 "bdev_name": "TestPT" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd12", 00:10:14.861 "bdev_name": "raid0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd13", 00:10:14.861 "bdev_name": "concat0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd14", 00:10:14.861 "bdev_name": "raid1" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd15", 00:10:14.861 "bdev_name": "AIO0" 00:10:14.861 } 00:10:14.861 ]' 00:10:14.861 11:51:28 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:10:14.861 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:10:14.861 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd0", 00:10:14.861 "bdev_name": "Malloc0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd1", 00:10:14.861 "bdev_name": "Malloc1p0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd2", 00:10:14.861 "bdev_name": "Malloc1p1" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd3", 00:10:14.861 "bdev_name": "Malloc2p0" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd4", 00:10:14.861 "bdev_name": "Malloc2p1" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd5", 00:10:14.861 "bdev_name": "Malloc2p2" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd6", 00:10:14.861 "bdev_name": "Malloc2p3" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd7", 00:10:14.861 "bdev_name": "Malloc2p4" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd8", 00:10:14.861 "bdev_name": "Malloc2p5" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd9", 00:10:14.861 "bdev_name": "Malloc2p6" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd10", 00:10:14.861 "bdev_name": "Malloc2p7" 00:10:14.861 }, 00:10:14.861 { 00:10:14.861 "nbd_device": "/dev/nbd11", 00:10:14.861 "bdev_name": "TestPT" 00:10:14.861 }, 00:10:14.862 { 00:10:14.862 "nbd_device": "/dev/nbd12", 00:10:14.862 "bdev_name": "raid0" 00:10:14.862 }, 00:10:14.862 { 00:10:14.862 "nbd_device": "/dev/nbd13", 00:10:14.862 "bdev_name": "concat0" 00:10:14.862 }, 00:10:14.862 { 00:10:14.862 "nbd_device": "/dev/nbd14", 00:10:14.862 "bdev_name": "raid1" 00:10:14.862 }, 00:10:14.862 { 00:10:14.862 "nbd_device": "/dev/nbd15", 00:10:14.862 "bdev_name": 
"AIO0" 00:10:14.862 } 00:10:14.862 ]' 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:14.862 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.121 11:51:28 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.121 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.380 11:51:28 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.639 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:15.899 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.158 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:10:16.417 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.417 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.417 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.417 11:51:29 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.677 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:16.937 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.197 11:51:30 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:17.764 11:51:31 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:17.764 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:17.765 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:17.765 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:17.765 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:17.765 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:17.765 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.024 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.287 11:51:31 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:18.636 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:18.636 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:18.636 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:18.636 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.637 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:18.896 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:19.155 11:51:32 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:19.155 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:19.414 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:19.415 11:51:32 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:10:19.683 /dev/nbd0 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:19.683 1+0 records in 00:10:19.683 1+0 records out 00:10:19.683 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233851 s, 17.5 MB/s 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:19.683 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:10:19.943 /dev/nbd1 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:10:19.943 1+0 records in 00:10:19.943 1+0 records out 00:10:19.943 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312238 s, 13.1 MB/s 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:19.943 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:10:20.202 /dev/nbd10 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:20.202 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:20.460 11:51:33 
blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:20.460 1+0 records in 00:10:20.460 1+0 records out 00:10:20.460 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333941 s, 12.3 MB/s 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:20.460 11:51:33 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:10:20.460 /dev/nbd11 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:20.719 1+0 records in 00:10:20.719 1+0 records out 00:10:20.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000387082 s, 10.6 MB/s 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:20.719 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:10:20.978 /dev/nbd12 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd12 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd12 /proc/partitions 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:20.978 1+0 records in 00:10:20.978 1+0 records out 00:10:20.978 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000375003 s, 10.9 MB/s 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:20.978 11:51:34 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:20.978 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:10:21.237 /dev/nbd13 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd13 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd13 /proc/partitions 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.237 1+0 records in 00:10:21.237 1+0 records out 00:10:21.237 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000423428 s, 9.7 MB/s 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.237 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:10:21.496 /dev/nbd14 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd14 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd14 /proc/partitions 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.496 1+0 records in 00:10:21.496 1+0 records out 00:10:21.496 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000465687 s, 
8.8 MB/s 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.496 11:51:34 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:10:21.755 /dev/nbd15 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd15 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd15 /proc/partitions 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:21.755 1+0 records in 00:10:21.755 1+0 records out 00:10:21.755 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536906 s, 7.6 MB/s 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:21.755 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:10:22.014 /dev/nbd2 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 
00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.014 1+0 records in 00:10:22.014 1+0 records out 00:10:22.014 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000525427 s, 7.8 MB/s 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.014 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:10:22.273 /dev/nbd3 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:10:22.273 11:51:35 
blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.273 1+0 records in 00:10:22.273 1+0 records out 00:10:22.273 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000536376 s, 7.6 MB/s 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.273 11:51:35 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:10:22.532 /dev/nbd4 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd4 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd4 /proc/partitions 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:22.532 1+0 records in 00:10:22.532 1+0 records out 00:10:22.532 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000628182 s, 6.5 MB/s 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 
00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:22.532 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:10:22.791 /dev/nbd5 00:10:22.791 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:10:22.791 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:10:22.791 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd5 00:10:22.791 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd5 /proc/partitions 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.050 1+0 records in 00:10:23.050 1+0 records out 00:10:23.050 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000538851 s, 7.6 MB/s 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.050 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:10:23.309 /dev/nbd6 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd6 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd6 /proc/partitions 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.309 1+0 records in 00:10:23.309 1+0 records out 00:10:23.309 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00071406 s, 5.7 MB/s 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:10:23.309 /dev/nbd7 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd7 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd7 /proc/partitions 00:10:23.309 11:51:36 
blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:23.309 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.568 1+0 records in 00:10:23.568 1+0 records out 00:10:23.568 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000612258 s, 6.7 MB/s 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.568 11:51:36 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:10:23.568 /dev/nbd8 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd8 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # 
local i 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd8 /proc/partitions 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:23.827 1+0 records in 00:10:23.827 1+0 records out 00:10:23.827 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000865123 s, 4.7 MB/s 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:10:23.827 /dev/nbd9 00:10:23.827 11:51:37 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:10:23.827 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd9 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd9 /proc/partitions 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:10:24.087 1+0 records in 00:10:24.087 1+0 records out 00:10:24.087 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000833971 s, 4.9 MB/s 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:24.087 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:10:24.347 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:10:24.347 { 00:10:24.347 "nbd_device": "/dev/nbd0", 00:10:24.347 "bdev_name": "Malloc0" 00:10:24.347 }, 00:10:24.347 { 00:10:24.347 "nbd_device": "/dev/nbd1", 00:10:24.347 "bdev_name": "Malloc1p0" 00:10:24.347 }, 00:10:24.347 { 00:10:24.347 "nbd_device": "/dev/nbd10", 00:10:24.347 "bdev_name": "Malloc1p1" 00:10:24.347 }, 00:10:24.347 { 00:10:24.348 "nbd_device": "/dev/nbd11", 00:10:24.348 "bdev_name": "Malloc2p0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd12", 00:10:24.348 "bdev_name": "Malloc2p1" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd13", 00:10:24.348 "bdev_name": "Malloc2p2" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd14", 00:10:24.348 "bdev_name": "Malloc2p3" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd15", 00:10:24.348 "bdev_name": "Malloc2p4" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd2", 00:10:24.348 "bdev_name": "Malloc2p5" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd3", 00:10:24.348 "bdev_name": "Malloc2p6" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd4", 00:10:24.348 "bdev_name": "Malloc2p7" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd5", 00:10:24.348 "bdev_name": "TestPT" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd6", 00:10:24.348 
"bdev_name": "raid0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd7", 00:10:24.348 "bdev_name": "concat0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd8", 00:10:24.348 "bdev_name": "raid1" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd9", 00:10:24.348 "bdev_name": "AIO0" 00:10:24.348 } 00:10:24.348 ]' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd0", 00:10:24.348 "bdev_name": "Malloc0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd1", 00:10:24.348 "bdev_name": "Malloc1p0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd10", 00:10:24.348 "bdev_name": "Malloc1p1" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd11", 00:10:24.348 "bdev_name": "Malloc2p0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd12", 00:10:24.348 "bdev_name": "Malloc2p1" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd13", 00:10:24.348 "bdev_name": "Malloc2p2" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd14", 00:10:24.348 "bdev_name": "Malloc2p3" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd15", 00:10:24.348 "bdev_name": "Malloc2p4" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd2", 00:10:24.348 "bdev_name": "Malloc2p5" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd3", 00:10:24.348 "bdev_name": "Malloc2p6" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd4", 00:10:24.348 "bdev_name": "Malloc2p7" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd5", 00:10:24.348 "bdev_name": "TestPT" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd6", 00:10:24.348 "bdev_name": "raid0" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd7", 00:10:24.348 "bdev_name": "concat0" 00:10:24.348 }, 00:10:24.348 { 
00:10:24.348 "nbd_device": "/dev/nbd8", 00:10:24.348 "bdev_name": "raid1" 00:10:24.348 }, 00:10:24.348 { 00:10:24.348 "nbd_device": "/dev/nbd9", 00:10:24.348 "bdev_name": "AIO0" 00:10:24.348 } 00:10:24.348 ]' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:10:24.348 /dev/nbd1 00:10:24.348 /dev/nbd10 00:10:24.348 /dev/nbd11 00:10:24.348 /dev/nbd12 00:10:24.348 /dev/nbd13 00:10:24.348 /dev/nbd14 00:10:24.348 /dev/nbd15 00:10:24.348 /dev/nbd2 00:10:24.348 /dev/nbd3 00:10:24.348 /dev/nbd4 00:10:24.348 /dev/nbd5 00:10:24.348 /dev/nbd6 00:10:24.348 /dev/nbd7 00:10:24.348 /dev/nbd8 00:10:24.348 /dev/nbd9' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:10:24.348 /dev/nbd1 00:10:24.348 /dev/nbd10 00:10:24.348 /dev/nbd11 00:10:24.348 /dev/nbd12 00:10:24.348 /dev/nbd13 00:10:24.348 /dev/nbd14 00:10:24.348 /dev/nbd15 00:10:24.348 /dev/nbd2 00:10:24.348 /dev/nbd3 00:10:24.348 /dev/nbd4 00:10:24.348 /dev/nbd5 00:10:24.348 /dev/nbd6 00:10:24.348 /dev/nbd7 00:10:24.348 /dev/nbd8 00:10:24.348 /dev/nbd9' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:10:24.348 256+0 records in 00:10:24.348 256+0 records out 00:10:24.348 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0114626 s, 91.5 MB/s 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:24.348 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:10:24.608 256+0 records in 00:10:24.608 256+0 records out 00:10:24.608 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.181621 s, 5.8 MB/s 00:10:24.608 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:24.608 11:51:37 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:10:24.608 256+0 records in 00:10:24.608 256+0 records out 00:10:24.608 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184008 s, 5.7 MB/s 00:10:24.608 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:10:24.608 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:10:24.866 256+0 records in 00:10:24.866 256+0 records out 00:10:24.866 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.179005 s, 5.9 MB/s 00:10:24.866 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:24.866 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:10:25.124 256+0 records in 00:10:25.124 256+0 records out 00:10:25.124 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183424 s, 5.7 MB/s 00:10:25.124 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.124 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:10:25.124 256+0 records in 00:10:25.124 256+0 records out 00:10:25.124 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.150702 s, 7.0 MB/s 00:10:25.124 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.124 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:10:25.383 256+0 records in 00:10:25.383 256+0 records out 00:10:25.383 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183742 s, 5.7 MB/s 00:10:25.383 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.383 11:51:38 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:10:25.641 256+0 records in 00:10:25.641 256+0 
records out 00:10:25.641 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.184045 s, 5.7 MB/s 00:10:25.641 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.641 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:10:25.641 256+0 records in 00:10:25.641 256+0 records out 00:10:25.641 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183725 s, 5.7 MB/s 00:10:25.641 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.641 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:10:25.899 256+0 records in 00:10:25.899 256+0 records out 00:10:25.899 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183691 s, 5.7 MB/s 00:10:25.899 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:25.899 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:10:26.157 256+0 records in 00:10:26.157 256+0 records out 00:10:26.157 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183989 s, 5.7 MB/s 00:10:26.157 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.157 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:10:26.416 256+0 records in 00:10:26.416 256+0 records out 00:10:26.416 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.182821 s, 5.7 MB/s 00:10:26.416 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.416 11:51:39 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:10:26.416 256+0 records in 00:10:26.416 256+0 records out 00:10:26.416 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.183434 s, 5.7 MB/s 00:10:26.416 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.416 11:51:39 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:10:26.705 256+0 records in 00:10:26.705 256+0 records out 00:10:26.705 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.189893 s, 5.5 MB/s 00:10:26.705 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.705 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:10:26.964 256+0 records in 00:10:26.964 256+0 records out 00:10:26.964 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.186474 s, 5.6 MB/s 00:10:26.964 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:26.964 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:10:27.222 256+0 records in 00:10:27.222 256+0 records out 00:10:27.222 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.188773 s, 5.6 MB/s 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:10:27.222 256+0 records in 00:10:27.222 256+0 records out 00:10:27.222 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.183101 s, 5.7 MB/s 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.222 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:10:27.222 
11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.223 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:10:27.480 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:27.481 11:51:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:27.738 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:27.996 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.255 11:51:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.513 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:28.770 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.028 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.286 11:51:42 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.286 11:51:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.544 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:29.800 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:10:30.057 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.315 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.574 11:51:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:10:30.832 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:30.833 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.091 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:10:31.658 11:51:44 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:31.658 11:51:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:31.658 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:10:31.916 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:10:31.916 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:10:31.916 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:10:32.175 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:10:32.176 
11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:10:32.434 malloc_lvol_verify 00:10:32.434 11:51:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:10:32.693 5f2892e8-b510-4df8-afb3-bebe0b082e42 00:10:32.952 11:51:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:10:32.952 00cebe59-0567-4534-b7de-cc5ec3f18fca 00:10:33.210 11:51:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:10:33.468 /dev/nbd0 00:10:33.726 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:10:33.726 mke2fs 1.46.5 (30-Dec-2021) 00:10:33.726 Discarding device blocks: 0/4096 done 00:10:33.726 Creating filesystem with 4096 1k blocks and 1024 inodes 00:10:33.726 00:10:33.726 Allocating group tables: 0/1 done 00:10:33.726 Writing inode tables: 0/1 done 00:10:33.726 Creating journal (1024 blocks): done 00:10:33.727 Writing superblocks and filesystem accounting information: 0/1 done 00:10:33.727 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:10:33.727 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1436806 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1436806 ']' 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1436806 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1436806 00:10:34.294 11:51:47 
blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1436806' 00:10:34.294 killing process with pid 1436806 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1436806 00:10:34.294 11:51:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1436806 00:10:34.554 11:51:48 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:10:34.554 00:10:34.554 real 0m25.520s 00:10:34.554 user 0m31.668s 00:10:34.554 sys 0m14.400s 00:10:34.554 11:51:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:34.554 11:51:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:10:34.554 ************************************ 00:10:34.554 END TEST bdev_nbd 00:10:34.554 ************************************ 00:10:34.554 11:51:48 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:34.554 11:51:48 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:10:34.554 11:51:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:10:34.554 11:51:48 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:10:34.554 11:51:48 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:10:34.554 11:51:48 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:10:34.554 11:51:48 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.554 11:51:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:34.554 ************************************ 00:10:34.554 START TEST bdev_fio 00:10:34.554 ************************************ 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1123 -- # fio_test_suite '' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:34.554 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:34.554 11:51:48 
blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:10:34.554 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo 
'[job_Malloc2p0]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:10:34.813 11:51:48 
blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p7 00:10:34.813 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:34.814 11:51:48 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:34.814 ************************************ 00:10:34.814 START TEST bdev_fio_rw_verify 00:10:34.814 ************************************ 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # 
local fio_dir=/usr/src/fio 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:34.814 11:51:48 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:34.814 11:51:48 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:35.072 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.072 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:35.073 fio-3.35 00:10:35.073 Starting 16 threads 00:10:47.273 00:10:47.274 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=1440818: Mon Jul 15 11:51:59 2024 00:10:47.274 read: IOPS=80.3k, BW=313MiB/s (329MB/s)(3135MiB/10001msec) 00:10:47.274 slat (usec): min=4, max=481, avg=39.85, stdev=14.59 00:10:47.274 clat (usec): min=7, max=1107, avg=321.63, stdev=140.98 00:10:47.274 lat (usec): min=18, max=1208, avg=361.49, stdev=149.72 00:10:47.274 clat percentiles (usec): 00:10:47.274 | 50.000th=[ 314], 99.000th=[ 676], 99.900th=[ 824], 99.990th=[ 922], 00:10:47.274 | 99.999th=[ 1057] 00:10:47.274 write: IOPS=129k, BW=504MiB/s (529MB/s)(4981MiB/9874msec); 0 zone resets 00:10:47.274 slat (usec): min=9, max=4029, avg=53.49, stdev=16.87 00:10:47.274 clat (usec): min=16, max=1627, avg=374.80, stdev=168.86 00:10:47.274 lat (usec): min=34, max=4497, avg=428.29, stdev=177.99 00:10:47.274 clat percentiles (usec): 
00:10:47.274 | 50.000th=[ 355], 99.000th=[ 865], 99.900th=[ 1057], 99.990th=[ 1156], 00:10:47.274 | 99.999th=[ 1254] 00:10:47.274 bw ( KiB/s): min=412888, max=682048, per=98.90%, avg=510842.89, stdev=4837.99, samples=304 00:10:47.274 iops : min=103222, max=170511, avg=127710.63, stdev=1209.48, samples=304 00:10:47.274 lat (usec) : 10=0.01%, 20=0.01%, 50=0.29%, 100=2.64%, 250=26.20% 00:10:47.274 lat (usec) : 500=53.12%, 750=16.14%, 1000=1.44% 00:10:47.274 lat (msec) : 2=0.16% 00:10:47.274 cpu : usr=99.18%, sys=0.38%, ctx=630, majf=0, minf=1963 00:10:47.274 IO depths : 1=12.6%, 2=25.1%, 4=49.9%, 8=12.4%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:47.274 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:47.274 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:47.274 issued rwts: total=802634,1275045,0,0 short=0,0,0,0 dropped=0,0,0,0 00:10:47.274 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:47.274 00:10:47.274 Run status group 0 (all jobs): 00:10:47.274 READ: bw=313MiB/s (329MB/s), 313MiB/s-313MiB/s (329MB/s-329MB/s), io=3135MiB (3288MB), run=10001-10001msec 00:10:47.274 WRITE: bw=504MiB/s (529MB/s), 504MiB/s-504MiB/s (529MB/s-529MB/s), io=4981MiB (5223MB), run=9874-9874msec 00:10:47.274 00:10:47.274 real 0m11.779s 00:10:47.274 user 2m45.994s 00:10:47.274 sys 0m1.218s 00:10:47.274 11:52:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:47.274 11:52:00 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:10:47.274 ************************************ 00:10:47.274 END TEST bdev_fio_rw_verify 00:10:47.274 ************************************ 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:10:47.274 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:47.275 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # 
printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ab12ddf7-10cb-4706-b517-b248bc2afc4d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab12ddf7-10cb-4706-b517-b248bc2afc4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2253fc13-25f2-52e2-b62d-6fd61864c6f9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2253fc13-25f2-52e2-b62d-6fd61864c6f9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' 
' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dc788c59-8f42-55b6-8ed5-1ff0432fa21c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc788c59-8f42-55b6-8ed5-1ff0432fa21c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "15c91d20-bbd4-5ede-a89d-6661c16a5ded"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c91d20-bbd4-5ede-a89d-6661c16a5ded",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' 
}' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "f7cf4fcd-1cdd-50c0-ba32-923a741b9612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f7cf4fcd-1cdd-50c0-ba32-923a741b9612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' 
"aliases": [' ' "85920319-cf12-53ae-8c1c-1720b376fae8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85920319-cf12-53ae-8c1c-1720b376fae8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b2abe9f1-8d7a-5526-9092-30565e923c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2abe9f1-8d7a-5526-9092-30565e923c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' 
"2e0850d1-5cdb-5ad8-b345-677c96a9baa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2e0850d1-5cdb-5ad8-b345-677c96a9baa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d58b8176-7bed-5190-852a-a4bade7b7069"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d58b8176-7bed-5190-852a-a4bade7b7069",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "bf618af8-6c69-577e-b205-471eca47978a"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf618af8-6c69-577e-b205-471eca47978a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "482c6328-8eef-5d90-a5cd-eb71d288d612"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "482c6328-8eef-5d90-a5cd-eb71d288d612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' 
"base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "30e250e7-da5e-4319-8706-27566513e1e3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "dc2095c8-8690-4dcf-97d0-8b99c775ed60",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "db5e3a0f-26e4-4c55-8637-79b9a3ee42ea",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "8f77d098-b4fe-4084-8986-edc119c38216"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 
512,' ' "num_blocks": 131072,' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d16d9db0-3e4b-49d8-9f3d-ffddbc74d623",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "557ebcf3-70cc-413b-80ea-2b2360cfc9e1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "862d35fb-c5fd-4811-9008-b6281e1db460",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "abc4f20c-6db1-4e15-b8f3-add206282ceb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "bc8ff510-e2b8-4026-9746-d14bf6770e02"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "bc8ff510-e2b8-4026-9746-d14bf6770e02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": 
true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:47.275 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:10:47.275 Malloc1p0 00:10:47.275 Malloc1p1 00:10:47.275 Malloc2p0 00:10:47.275 Malloc2p1 00:10:47.275 Malloc2p2 00:10:47.275 Malloc2p3 00:10:47.275 Malloc2p4 00:10:47.275 Malloc2p5 00:10:47.275 Malloc2p6 00:10:47.275 Malloc2p7 00:10:47.275 TestPT 00:10:47.275 raid0 00:10:47.275 concat0 ]] 00:10:47.275 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ab12ddf7-10cb-4706-b517-b248bc2afc4d"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ab12ddf7-10cb-4706-b517-b248bc2afc4d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": 
false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2253fc13-25f2-52e2-b62d-6fd61864c6f9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2253fc13-25f2-52e2-b62d-6fd61864c6f9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "dc788c59-8f42-55b6-8ed5-1ff0432fa21c"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "dc788c59-8f42-55b6-8ed5-1ff0432fa21c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "15c91d20-bbd4-5ede-a89d-6661c16a5ded"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "15c91d20-bbd4-5ede-a89d-6661c16a5ded",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "4a3867bc-2413-5fbc-accb-d3dfb13c0b7d",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "f7cf4fcd-1cdd-50c0-ba32-923a741b9612"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f7cf4fcd-1cdd-50c0-ba32-923a741b9612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "85920319-cf12-53ae-8c1c-1720b376fae8"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "85920319-cf12-53ae-8c1c-1720b376fae8",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b2abe9f1-8d7a-5526-9092-30565e923c78"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b2abe9f1-8d7a-5526-9092-30565e923c78",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "2e0850d1-5cdb-5ad8-b345-677c96a9baa9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "2e0850d1-5cdb-5ad8-b345-677c96a9baa9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "d58b8176-7bed-5190-852a-a4bade7b7069"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d58b8176-7bed-5190-852a-a4bade7b7069",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "bf618af8-6c69-577e-b205-471eca47978a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "bf618af8-6c69-577e-b205-471eca47978a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' 
"base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "482c6328-8eef-5d90-a5cd-eb71d288d612"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "482c6328-8eef-5d90-a5cd-eb71d288d612",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "30e250e7-da5e-4319-8706-27566513e1e3"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "30e250e7-da5e-4319-8706-27566513e1e3",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "dc2095c8-8690-4dcf-97d0-8b99c775ed60",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "db5e3a0f-26e4-4c55-8637-79b9a3ee42ea",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "8f77d098-b4fe-4084-8986-edc119c38216"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' 
' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "8f77d098-b4fe-4084-8986-edc119c38216",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "d16d9db0-3e4b-49d8-9f3d-ffddbc74d623",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "557ebcf3-70cc-413b-80ea-2b2360cfc9e1",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9d6ca8c1-6620-4d73-ac32-e8a1ebf05c8e",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "862d35fb-c5fd-4811-9008-b6281e1db460",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "abc4f20c-6db1-4e15-b8f3-add206282ceb",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "bc8ff510-e2b8-4026-9746-d14bf6770e02"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "bc8ff510-e2b8-4026-9746-d14bf6770e02",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- 
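The loop starting here pipes each bdev description through jq and keeps only the names of bdevs that advertise unmap (trim) support; raid1 and AIO0 above report `"unmap": false`, so they receive no trim job. A minimal Python sketch of the same selection, over a hypothetical three-entry subset of the dumped list:

```python
import json

# Hypothetical, trimmed-down versions of three of the bdev records dumped
# above; only the fields the jq filter inspects are kept.
bdev_json = '''
[
  {"name": "Malloc2p0", "supported_io_types": {"unmap": true}},
  {"name": "raid1",     "supported_io_types": {"unmap": false}},
  {"name": "concat0",   "supported_io_types": {"unmap": true}}
]
'''

# Equivalent of: jq -r 'select(.supported_io_types.unmap == true) | .name'
trim_capable = [b["name"] for b in json.loads(bdev_json)
                if b["supported_io_types"]["unmap"]]
print(trim_capable)
```

This matches the job list that follows: raid0 and concat0 appear (both report `"unmap": true`), while raid1 and AIO0 do not.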
bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:10:47.276 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p0]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p5 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:47.277 11:52:00 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:47.277 ************************************ 00:10:47.277 START TEST bdev_fio_trim 00:10:47.277 ************************************ 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- 
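Each iteration of the loop above appends one job section to bdev.fio: a `[job_<name>]` header followed by `filename=<name>`. Based on the echoes, the generated fragment plausibly reads as follows (global options such as the ioengine, `iodepth=8` and `bs=4k` arrive on the fio command line; the `rw=trimwrite` seen in the job banners below presumably comes from a section already present in bdev.fio):

```ini
[job_Malloc0]
filename=Malloc0

[job_Malloc1p0]
filename=Malloc1p0

; ... one section per trim-capable bdev ...

[job_concat0]
filename=concat0
```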
common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:47.277 11:52:00 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:10:47.277 11:52:00 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:10:47.277 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc1p0: (g=0): 
rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:10:47.277 fio-3.35 00:10:47.277 Starting 14 threads 00:10:59.562 00:10:59.562 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=1442527: Mon Jul 15 11:52:11 2024 00:10:59.562 write: IOPS=123k, BW=480MiB/s (503MB/s)(4799MiB/10001msec); 0 zone resets 00:10:59.562 slat (usec): min=3, max=573, avg=40.00, stdev=11.59 
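The write summary above is internally consistent: at `bs=4k`, roughly 123k IOPS works out to the reported ~480 MiB/s, and the total io divided by the 10 s runtime gives the same figure. A quick check, with the numbers copied from the log:

```python
# ~123k write IOPS at a 4096-byte block size should account for the
# reported bw=480MiB/s.
iops = 123_000            # "write: IOPS=123k" from the summary
bs = 4096                 # --bs=4k on the fio command line
bw_mib_s = iops * bs / (1024 ** 2)
print(round(bw_mib_s))    # bandwidth implied by the IOPS figure, in MiB/s

# Likewise total data over the run: io=4799MiB in run=10001msec.
total_mib = 4799
runtime_s = 10.001
print(round(total_mib / runtime_s))
```

Both computations land on ~480 MiB/s, matching the WRITE and TRIM status lines at the end of the run.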
00:10:59.562 clat (usec): min=34, max=3657, avg=287.38, stdev=101.44 00:10:59.562 lat (usec): min=45, max=3693, avg=327.38, stdev=106.76 00:10:59.562 clat percentiles (usec): 00:10:59.562 | 50.000th=[ 277], 99.000th=[ 562], 99.900th=[ 668], 99.990th=[ 734], 00:10:59.562 | 99.999th=[ 922] 00:10:59.562 bw ( KiB/s): min=422006, max=651182, per=100.00%, avg=493791.68, stdev=4207.25, samples=266 00:10:59.562 iops : min=105501, max=162795, avg=123447.74, stdev=1051.81, samples=266 00:10:59.562 trim: IOPS=123k, BW=480MiB/s (503MB/s)(4799MiB/10001msec); 0 zone resets 00:10:59.562 slat (usec): min=4, max=1166, avg=26.94, stdev= 7.76 00:10:59.562 clat (usec): min=5, max=3693, avg=322.65, stdev=111.61 00:10:59.562 lat (usec): min=16, max=3717, avg=349.59, stdev=115.65 00:10:59.562 clat percentiles (usec): 00:10:59.562 | 50.000th=[ 314], 99.000th=[ 619], 99.900th=[ 725], 99.990th=[ 799], 00:10:59.562 | 99.999th=[ 979] 00:10:59.562 bw ( KiB/s): min=422006, max=651190, per=100.00%, avg=493792.11, stdev=4207.38, samples=266 00:10:59.562 iops : min=105501, max=162797, avg=123447.95, stdev=1051.84, samples=266 00:10:59.562 lat (usec) : 10=0.01%, 20=0.01%, 50=0.04%, 100=0.77%, 250=33.58% 00:10:59.562 lat (usec) : 500=61.07%, 750=4.51%, 1000=0.03% 00:10:59.562 lat (msec) : 2=0.01%, 4=0.01% 00:10:59.562 cpu : usr=99.56%, sys=0.01%, ctx=525, majf=0, minf=1116 00:10:59.562 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:10:59.562 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:59.562 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:10:59.562 issued rwts: total=0,1228478,1228481,0 short=0,0,0,0 dropped=0,0,0,0 00:10:59.562 latency : target=0, window=0, percentile=100.00%, depth=8 00:10:59.562 00:10:59.562 Run status group 0 (all jobs): 00:10:59.562 WRITE: bw=480MiB/s (503MB/s), 480MiB/s-480MiB/s (503MB/s-503MB/s), io=4799MiB (5032MB), run=10001-10001msec 00:10:59.562 TRIM: bw=480MiB/s (503MB/s), 
480MiB/s-480MiB/s (503MB/s-503MB/s), io=4799MiB (5032MB), run=10001-10001msec 00:10:59.562 00:10:59.562 real 0m11.740s 00:10:59.562 user 2m26.061s 00:10:59.562 sys 0m0.782s 00:10:59.562 11:52:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:59.562 11:52:11 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:10:59.562 ************************************ 00:10:59.562 END TEST bdev_fio_trim 00:10:59.562 ************************************ 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:10:59.563 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:10:59.563 00:10:59.563 real 0m23.901s 00:10:59.563 user 5m12.271s 00:10:59.563 sys 0m2.199s 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:10:59.563 11:52:11 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:10:59.563 ************************************ 00:10:59.563 END TEST bdev_fio 00:10:59.563 ************************************ 00:10:59.563 11:52:12 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:10:59.563 11:52:12 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:10:59.563 11:52:12 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 
00:10:59.563 11:52:12 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:10:59.563 11:52:12 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:10:59.563 11:52:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:59.563 ************************************ 00:10:59.563 START TEST bdev_verify 00:10:59.563 ************************************ 00:10:59.563 11:52:12 blockdev_general.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:10:59.563 [2024-07-15 11:52:12.149036] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:10:59.563 [2024-07-15 11:52:12.149100] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1443977 ] 00:10:59.563 [2024-07-15 11:52:12.267230] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:59.563 [2024-07-15 11:52:12.369130] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:59.563 [2024-07-15 11:52:12.369136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:59.563 [2024-07-15 11:52:12.534219] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:59.563 [2024-07-15 11:52:12.534273] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:59.563 [2024-07-15 11:52:12.534288] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:59.563 [2024-07-15 11:52:12.542228] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:59.563 [2024-07-15 11:52:12.542256] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:59.563 [2024-07-15 11:52:12.550244] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:59.563 [2024-07-15 11:52:12.550268] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:59.563 [2024-07-15 11:52:12.627618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:59.563 [2024-07-15 11:52:12.627671] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.563 [2024-07-15 11:52:12.627702] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e86ec0 00:10:59.563 [2024-07-15 11:52:12.627716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.563 [2024-07-15 11:52:12.629176] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.563 [2024-07-15 11:52:12.629205] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:59.563 Running I/O for 5 seconds... 
00:11:04.855 Latency(us)
00:11:04.856 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:04.856 (all jobs: workload verify, queue depth 128, IO size 4096; one job per core mask, verification LBA range shown as start/length)
00:11:04.856 Malloc0    (0x1, 0x0/0x1000)    : 5.15 1168.23 4.56 0.00 0.00 109359.53   477.27 346485.98
00:11:04.856 Malloc0    (0x2, 0x1000/0x1000) : 5.15  945.23 3.69 0.00 0.00 135117.64   673.17 404841.52
00:11:04.856 Malloc1p0  (0x1, 0x0/0x800)     : 5.20  615.71 2.41 0.00 0.00 206946.07  2464.72 175978.41
00:11:04.856 Malloc1p0  (0x2, 0x800/0x800)   : 5.15  497.22 1.94 0.00 0.00 256087.09  3177.07 219745.06
00:11:04.856 Malloc1p1  (0x1, 0x0/0x800)     : 5.20  615.46 2.40 0.00 0.00 206612.20  2450.48 174154.80
00:11:04.856 Malloc1p1  (0x2, 0x800/0x800)   : 5.15  496.91 1.94 0.00 0.00 255612.88  3162.82 219745.06
00:11:04.856 Malloc2p0  (0x1, 0x0/0x200)     : 5.20  615.20 2.40 0.00 0.00 206269.83  2436.23 175066.60
00:11:04.856 Malloc2p0  (0x2, 0x200/0x200)   : 5.15  496.63 1.94 0.00 0.00 255122.48  3789.69 218833.25
00:11:04.856 Malloc2p1  (0x1, 0x0/0x200)     : 5.20  614.95 2.40 0.00 0.00 205923.66  3262.55 171419.38
00:11:04.856 Malloc2p1  (0x2, 0x200/0x200)   : 5.16  496.35 1.94 0.00 0.00 254490.33  4103.12 216097.84
00:11:04.856 Malloc2p2  (0x1, 0x0/0x200)     : 5.21  614.70 2.40 0.00 0.00 205504.30  3519.00 166860.35
00:11:04.856 Malloc2p2  (0x2, 0x200/0x200)   : 5.16  496.06 1.94 0.00 0.00 253809.75  3319.54 213362.42
00:11:04.856 Malloc2p3  (0x1, 0x0/0x200)     : 5.21  614.45 2.40 0.00 0.00 205060.14  2635.69 165036.74
00:11:04.856 Malloc2p3  (0x2, 0x200/0x200)   : 5.16  495.78 1.94 0.00 0.00 253295.10  3162.82 212450.62
00:11:04.856 Malloc2p4  (0x1, 0x0/0x200)     : 5.21  614.21 2.40 0.00 0.00 204709.93  2464.72 165948.55
00:11:04.856 Malloc2p4  (0x2, 0x200/0x200)   : 5.17  495.50 1.94 0.00 0.00 252816.71  3732.70 212450.62
00:11:04.856 Malloc2p5  (0x1, 0x0/0x200)     : 5.21  613.96 2.40 0.00 0.00 204361.36  2478.97 167772.16
00:11:04.856 Malloc2p5  (0x2, 0x200/0x200)   : 5.25  512.10 2.00 0.00 0.00 243987.96  4188.61 210627.01
00:11:04.856 Malloc2p6  (0x1, 0x0/0x200)     : 5.21  613.71 2.40 0.00 0.00 204014.68  3177.07 166860.35
00:11:04.856 Malloc2p6  (0x2, 0x200/0x200)   : 5.25  511.86 2.00 0.00 0.00 243324.68  2977.61 208803.39
00:11:04.856 Malloc2p7  (0x1, 0x0/0x200)     : 5.22  613.45 2.40 0.00 0.00 203609.45  3604.48 163213.13
00:11:04.856 Malloc2p7  (0x2, 0x200/0x200)   : 5.25  511.62 2.00 0.00 0.00 242762.37  3504.75 202420.76
00:11:04.856 TestPT     (0x1, 0x0/0x1000)    : 5.23  611.97 2.39 0.00 0.00 203481.23  9459.98 162301.33
00:11:04.856 TestPT     (0x2, 0x1000/0x1000) : 5.24  488.94 1.91 0.00 0.00 253230.93 37839.92 204244.37
00:11:04.856 raid0      (0x1, 0x0/0x2000)    : 5.22  612.92 2.39 0.00 0.00 202692.94  3006.11 152271.47
00:11:04.856 raid0      (0x2, 0x2000/0x2000) : 5.26  511.37 2.00 0.00 0.00 241532.07  3433.52 187831.87
00:11:04.856 concat0    (0x1, 0x0/0x2000)    : 5.22  612.68 2.39 0.00 0.00 202294.65  2336.50 155918.69
00:11:04.856 concat0    (0x2, 0x2000/0x2000) : 5.26  511.13 2.00 0.00 0.00 240874.41  3390.78 188743.68
00:11:04.856 raid1      (0x1, 0x0/0x1000)    : 5.23  612.43 2.39 0.00 0.00 201933.07  3077.34 163213.13
00:11:04.856 raid1      (0x2, 0x1000/0x1000) : 5.26  510.88 2.00 0.00 0.00 240195.37  4188.61 192390.90
00:11:04.856 AIO0       (0x1, 0x0/0x4e2)     : 5.23  612.23 2.39 0.00 0.00 201532.80  1645.52 171419.38
00:11:04.856 AIO0       (0x2, 0x4e2/0x4e2)   : 5.26  510.69 1.99 0.00 0.00 239531.79  1659.77 196038.12
00:11:04.856 ===================================================================================================================
00:11:04.856 Total : 18864.53 73.69 0.00 0.00 212677.59 477.27 404841.52
00:11:05.116
00:11:05.116 real 0m6.484s
00:11:05.116 user 0m11.991s
00:11:05.116 sys 0m0.416s
00:11:05.116 11:52:18 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:05.116 11:52:18 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:11:05.116 ************************************
00:11:05.116 END TEST
bdev_verify
00:11:05.116 ************************************
00:11:05.116 11:52:18 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:11:05.116 11:52:18 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:11:05.116 11:52:18 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:11:05.116 11:52:18 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:05.116 11:52:18 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:05.116 ************************************
00:11:05.116 START TEST bdev_verify_big_io
00:11:05.116 ************************************
00:11:05.116 11:52:18 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:11:05.376 [2024-07-15 11:52:18.718877] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:11:05.376 [2024-07-15 11:52:18.718926] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1444865 ]
00:11:05.376 [2024-07-15 11:52:18.829927] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:11:05.376 [2024-07-15 11:52:18.935568] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:11:05.376 [2024-07-15 11:52:18.935572] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:05.634 [2024-07-15 11:52:19.092359] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:05.634 [2024-07-15 11:52:19.092418] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:11:05.634 [2024-07-15 11:52:19.092433] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:11:05.634 [2024-07-15 11:52:19.100365] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:05.634 [2024-07-15 11:52:19.100392] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:05.634 [2024-07-15 11:52:19.108378] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:05.634 [2024-07-15 11:52:19.108401] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:05.634 [2024-07-15 11:52:19.180832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:05.634 [2024-07-15 11:52:19.180885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:05.634 [2024-07-15 11:52:19.180902] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1577ec0
00:11:05.634 [2024-07-15 11:52:19.180914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:05.634 [2024-07-15 11:52:19.182379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:11:05.634 [2024-07-15 11:52:19.182409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:11:05.892 [2024-07-15 11:52:19.365431] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.366831] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.368881] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.370256] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.372272] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.373375] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.374943] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.376568] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.377624] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.379265] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.380327] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.381966] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.383018] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.384575] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.385464] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.386909] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32
00:11:05.892 [2024-07-15 11:52:19.410939] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:11:05.892 [2024-07-15 11:52:19.412895] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78
00:11:05.892 Running I/O for 5 seconds...
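The repeated bdevperf warnings above all apply the same rule: the requested verify queue depth (-q 128) is clamped to the number of I/O requests the target bdev can accept at once (32 for the Malloc2p* partition bdevs, 78 for AIO0). A minimal sketch of that clamp, with a hypothetical helper name rather than SPDK's actual code:

```python
def effective_queue_depth(requested: int, bdev_limit: int) -> int:
    # bdevperf's verify job cannot keep more requests in flight than the
    # bdev can accept simultaneously, so -q is capped at the bdev's limit.
    return min(requested, bdev_limit)

# Values from the warnings above: -q 128 against per-bdev limits.
print(effective_queue_depth(128, 32))  # Malloc2p* partitions -> 32
print(effective_queue_depth(128, 78))  # AIO0 -> 78
```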
00:11:14.010 Latency(us)
00:11:14.010 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:14.010 (all jobs: workload verify, IO size 65536; queue depth 128, limited to 32 for the Malloc2p* jobs and 78 for AIO0; verification LBA range shown as start/length)
00:11:14.010 Malloc0    (0x1, 0x0/0x100)   : 5.99 128.13 8.01 0.00 0.00  978187.05    861.94 2582232.38
00:11:14.010 Malloc0    (0x2, 0x100/0x100) : 5.98 106.98 6.69 0.00 0.00 1166838.19   1132.63 2990721.11
00:11:14.010 Malloc1p0  (0x1, 0x0/0x80)    : 6.44  75.81 4.74 0.00 0.00 1532119.79   2493.22 3034487.76
00:11:14.010 Malloc1p0  (0x2, 0x80/0x80)   : 7.26  28.66 1.79 0.00 0.00 3941483.91   1866.35 6098153.29
00:11:14.010 Malloc1p1  (0x1, 0x0/0x80)    : 6.76  33.13 2.07 0.00 0.00 3394504.78   1510.18 5689664.56
00:11:14.010 Malloc1p1  (0x2, 0x80/0x80)   : 7.28  30.76 1.92 0.00 0.00 3601084.87   1866.35 5835553.39
00:11:14.010 Malloc2p0  (0x1, 0x0/0x20)    : 6.44  22.36 1.40 0.00 0.00 1263589.79    648.24 2217510.29
00:11:14.010 Malloc2p0  (0x2, 0x20/0x20)   : 6.61  19.36 1.21 0.00 0.00 1414057.67    780.02 2480110.19
00:11:14.010 Malloc2p1  (0x1, 0x0/0x20)    : 6.44  22.36 1.40 0.00 0.00 1250952.77    637.55 2188332.52
00:11:14.010 Malloc2p1  (0x2, 0x20/0x20)   : 6.61  19.35 1.21 0.00 0.00 1397845.22    769.34 2450932.42
00:11:14.010 Malloc2p2  (0x1, 0x0/0x20)    : 6.44  22.35 1.40 0.00 0.00 1239230.74    630.43 2173743.64
00:11:14.010 Malloc2p2  (0x2, 0x20/0x20)   : 6.62  19.35 1.21 0.00 0.00 1381834.78    762.21 2421754.66
00:11:14.010 Malloc2p3  (0x1, 0x0/0x20)    : 6.44  22.35 1.40 0.00 0.00 1227908.22    651.80 2129976.99
00:11:14.010 Malloc2p3  (0x2, 0x20/0x20)   : 6.62  19.34 1.21 0.00 0.00 1366078.73    755.09 2392576.89
00:11:14.010 Malloc2p4  (0x1, 0x0/0x20)    : 6.44  22.34 1.40 0.00 0.00 1215446.41    626.87 2100799.22
00:11:14.010 Malloc2p4  (0x2, 0x20/0x20)   : 6.62  19.34 1.21 0.00 0.00 1350220.51    737.28 2348810.24
00:11:14.010 Malloc2p5  (0x1, 0x0/0x20)    : 6.45  22.34 1.40 0.00 0.00 1204318.45    626.87 2071621.45
00:11:14.010 Malloc2p5  (0x2, 0x20/0x20)   : 6.62  19.33 1.21 0.00 0.00 1333265.88    740.84 2319632.47
00:11:14.010 Malloc2p6  (0x1, 0x0/0x20)    : 6.45  22.34 1.40 0.00 0.00 1192110.54    648.24 2042443.69
00:11:14.010 Malloc2p6  (0x2, 0x20/0x20)   : 6.82  21.12 1.32 0.00 0.00 1216517.60    769.34 2275865.82
00:11:14.010 Malloc2p7  (0x1, 0x0/0x20)    : 6.45  22.33 1.40 0.00 0.00 1179744.08    605.50 2013265.92
00:11:14.010 Malloc2p7  (0x2, 0x20/0x20)   : 6.82  21.11 1.32 0.00 0.00 1201755.78    740.84 2246688.06
00:11:14.010 TestPT     (0x1, 0x0/0x100)   : 6.85  33.30 2.08 0.00 0.00 3026910.38 113519.75 3997354.07
00:11:14.010 TestPT     (0x2, 0x100/0x100) : 7.31  30.65 1.92 0.00 0.00 3209847.45 166860.35 3968176.31
00:11:14.010 raid0      (0x1, 0x0/0x200)   : 7.00  36.55 2.28 0.00 0.00 2625459.13   1595.66 4843509.31
00:11:14.010 raid0      (0x2, 0x200/0x200) : 7.26  35.25 2.20 0.00 0.00 2689495.40   2023.07 4785153.78
00:11:14.010 concat0    (0x1, 0x0/0x200)   : 7.07  43.02 2.69 0.00 0.00 2204039.12   1567.17 4639264.95
00:11:14.010 concat0    (0x2, 0x200/0x200) : 7.23  52.29 3.27 0.00 0.00 1756836.98   2008.82 4580909.41
00:11:14.010 raid1      (0x1, 0x0/0x100)   : 7.06  58.91 3.68 0.00 0.00 1581672.78   2008.82 4435020.58
00:11:14.010 raid1      (0x2, 0x100/0x100) : 7.26  70.20 4.39 0.00 0.00 1250194.58   2621.44 4376665.04
00:11:14.010 AIO0       (0x1, 0x0/0x4e)    : 7.07  56.05 3.50 0.00 0.00  983499.91    804.95 3194965.48
00:11:14.010 AIO0       (0x2, 0x4e/0x4e)   : 7.43  90.75 5.67 0.00 0.00  572970.11    633.99 3442976.50
00:11:14.010 ===================================================================================================================
00:11:14.010 Total : 1247.49 77.97 0.00 0.00 1620849.05 605.50 6098153.29
00:11:14.010
00:11:14.010 real 0m8.707s
00:11:14.010 user 0m16.432s
00:11:14.010 sys 0m0.419s
00:11:14.010 11:52:27 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:14.010 11:52:27 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:11:14.011 ************************************
00:11:14.011 END TEST
bdev_verify_big_io
00:11:14.011 ************************************
00:11:14.011 11:52:27 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:11:14.011 11:52:27 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:14.011 11:52:27 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:11:14.011 11:52:27 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:14.011 11:52:27 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:14.011 ************************************
00:11:14.011 START TEST bdev_write_zeroes
00:11:14.011 ************************************
00:11:14.011 11:52:27 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:14.011 [2024-07-15 11:52:27.571573] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:11:14.011 [2024-07-15 11:52:27.571716] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1445961 ]
00:11:14.270 [2024-07-15 11:52:27.770165] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:14.529 [2024-07-15 11:52:27.875158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:14.529 [2024-07-15 11:52:28.030569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:14.529 [2024-07-15 11:52:28.030633] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:11:14.529 [2024-07-15 11:52:28.030647] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:11:14.529 [2024-07-15 11:52:28.038577] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:14.529 [2024-07-15 11:52:28.038606] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:11:14.529 [2024-07-15 11:52:28.046587] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:14.529 [2024-07-15 11:52:28.046611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:11:14.529 [2024-07-15 11:52:28.118817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3
00:11:14.529 [2024-07-15 11:52:28.118868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:14.529 [2024-07-15 11:52:28.118884] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e3580
00:11:14.529 [2024-07-15 11:52:28.118896] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:14.529 [2024-07-15 11:52:28.120297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev
registered
00:11:14.529 [2024-07-15 11:52:28.120324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT
00:11:14.786 Running I/O for 1 seconds...
00:11:16.177
00:11:16.177 Latency(us)
00:11:16.177 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:11:16.177 (all jobs: core mask 0x1, workload write_zeroes, queue depth 128, IO size 4096)
00:11:16.177 Malloc0    : 1.05 4989.96 19.49 0.00 0.00 25643.35  641.11 42626.89
00:11:16.177 Malloc1p0  : 1.05 4982.88 19.46 0.00 0.00 25634.03  897.56 41715.09
00:11:16.177 Malloc1p1  : 1.05 4975.83 19.44 0.00 0.00 25610.34  890.43 40803.28
00:11:16.177 Malloc2p0  : 1.06 4968.78 19.41 0.00 0.00 25589.59  890.43 39891.48
00:11:16.177 Malloc2p1  : 1.06 4961.81 19.38 0.00 0.00 25568.31  894.00 38979.67
00:11:16.177 Malloc2p2  : 1.06 4954.87 19.35 0.00 0.00 25550.43  890.43 38295.82
00:11:16.177 Malloc2p3  : 1.06 4947.85 19.33 0.00 0.00 25531.44  886.87 37384.01
00:11:16.177 Malloc2p4  : 1.06 4940.95 19.30 0.00 0.00 25514.46  901.12 36472.21
00:11:16.177 Malloc2p5  : 1.06 4934.03 19.27 0.00 0.00 25493.90  894.00 35560.40
00:11:16.177 Malloc2p6  : 1.07 4927.09 19.25 0.00 0.00 25475.52  890.43 34648.60
00:11:16.178 Malloc2p7  : 1.07 4920.20 19.22 0.00 0.00 25455.34  894.00 33736.79
00:11:16.178 TestPT     : 1.07 4913.39 19.19 0.00 0.00 25435.39  926.05 32824.99
00:11:16.178 raid0      : 1.07 4905.48 19.16 0.00 0.00 25407.93 1631.28 31229.33
00:11:16.178 concat0    : 1.07 4897.72 19.13 0.00 0.00 25357.56 1609.91 29633.67
00:11:16.178 raid1      : 1.07 4888.07 19.09 0.00 0.00 25296.55 2578.70 27012.23
00:11:16.178 AIO0       : 1.07 4882.18 19.07 0.00 0.00 25209.09 1054.27 26100.42
00:11:16.178 ===================================================================================================================
00:11:16.178 Total : 78991.12 308.56 0.00 0.00 25485.83 641.11 42626.89
00:11:16.178
00:11:16.178 real 0m2.306s
00:11:16.178 user 0m1.843s
00:11:16.178 sys 0m0.400s
00:11:16.178 11:52:29 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:16.178 11:52:29 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:11:16.178 ************************************
00:11:16.178 END TEST bdev_write_zeroes
00:11:16.178 ************************************
00:11:16.438 11:52:29 blockdev_general -- common/autotest_common.sh@1142 -- # return 0
00:11:16.438 11:52:29 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:16.438 11:52:29 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:11:16.438 11:52:29 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:16.438 11:52:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:16.438 ************************************
00:11:16.438 START TEST bdev_json_nonenclosed
00:11:16.438 ************************************
00:11:16.438 11:52:29 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:16.438 [2024-07-15 11:52:29.906111] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:11:16.438 [2024-07-15 11:52:29.906171] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1446306 ]
00:11:16.699 [2024-07-15 11:52:30.035813] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:16.699 [2024-07-15 11:52:30.138406] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:16.699 [2024-07-15 11:52:30.138480] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:11:16.699 [2024-07-15 11:52:30.138501] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:11:16.699 [2024-07-15 11:52:30.138513] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:11:16.699
00:11:16.699 real 0m0.398s
00:11:16.699 user 0m0.236s
00:11:16.699 sys 0m0.158s
00:11:16.699 11:52:30 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:11:16.699 11:52:30 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:16.699 11:52:30 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:11:16.699 ************************************
00:11:16.699 END TEST bdev_json_nonenclosed
00:11:16.699 ************************************
00:11:16.699 11:52:30 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:11:16.699 11:52:30 blockdev_general -- bdev/blockdev.sh@782 -- # true
00:11:16.699 11:52:30 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:16.699 11:52:30 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:11:16.699 11:52:30 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:16.699 11:52:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:16.957 ************************************
00:11:16.957 START TEST bdev_json_nonarray
00:11:16.957 ************************************
00:11:16.957 11:52:30 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:11:16.957 [2024-07-15 11:52:30.395798] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:11:16.957 [2024-07-15 11:52:30.395863] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1446412 ]
00:11:16.957 [2024-07-15 11:52:30.526438] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:17.215 [2024-07-15 11:52:30.626732] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:17.215 [2024-07-15 11:52:30.626810] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:11:17.215 [2024-07-15 11:52:30.626881] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:11:17.215 [2024-07-15 11:52:30.626894] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:11:17.215
00:11:17.215 real 0m0.395s
00:11:17.215 user 0m0.240s
00:11:17.215 sys 0m0.152s
00:11:17.215 11:52:30 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:11:17.215 11:52:30 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:11:17.215 11:52:30 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:11:17.215 ************************************
00:11:17.215 END TEST bdev_json_nonarray
00:11:17.215 ************************************
00:11:17.215 11:52:30 blockdev_general -- common/autotest_common.sh@1142 -- # return 234
00:11:17.215 11:52:30 blockdev_general -- bdev/blockdev.sh@785 -- # true
00:11:17.215 11:52:30 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]]
00:11:17.215 11:52:30 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite ''
00:11:17.215 11:52:30 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']'
00:11:17.215 11:52:30 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable
00:11:17.215 11:52:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x
00:11:17.215 ************************************
00:11:17.215 START TEST bdev_qos
00:11:17.215 ************************************
00:11:17.215 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@1123 -- # qos_test_suite ''
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=1446517
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 1446517'
00:11:17.474 Process qos testing pid: 1446517
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 1446517
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@829 -- # '[' -z 1446517 ']'
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 ''
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local max_retries=100
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:11:17.474 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:17.474 11:52:30 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:17.474 [2024-07-15 11:52:30.870379] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:11:17.474 [2024-07-15 11:52:30.870443] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1446517 ] 00:11:17.474 [2024-07-15 11:52:31.003585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:17.732 [2024-07-15 11:52:31.118473] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@862 -- # return 0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 Malloc_0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.668 11:52:32 
blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 [ 00:11:18.668 { 00:11:18.668 "name": "Malloc_0", 00:11:18.668 "aliases": [ 00:11:18.668 "d5547051-acff-4443-bbc3-8e17cdffb870" 00:11:18.668 ], 00:11:18.668 "product_name": "Malloc disk", 00:11:18.668 "block_size": 512, 00:11:18.668 "num_blocks": 262144, 00:11:18.668 "uuid": "d5547051-acff-4443-bbc3-8e17cdffb870", 00:11:18.668 "assigned_rate_limits": { 00:11:18.668 "rw_ios_per_sec": 0, 00:11:18.668 "rw_mbytes_per_sec": 0, 00:11:18.668 "r_mbytes_per_sec": 0, 00:11:18.668 "w_mbytes_per_sec": 0 00:11:18.668 }, 00:11:18.668 "claimed": false, 00:11:18.668 "zoned": false, 00:11:18.668 "supported_io_types": { 00:11:18.668 "read": true, 00:11:18.668 "write": true, 00:11:18.668 "unmap": true, 00:11:18.668 "flush": true, 00:11:18.668 "reset": true, 00:11:18.668 "nvme_admin": false, 00:11:18.668 "nvme_io": false, 00:11:18.668 "nvme_io_md": false, 00:11:18.668 "write_zeroes": true, 00:11:18.668 "zcopy": true, 00:11:18.668 "get_zone_info": false, 00:11:18.668 "zone_management": false, 00:11:18.668 "zone_append": false, 00:11:18.668 "compare": false, 00:11:18.668 "compare_and_write": false, 00:11:18.668 "abort": true, 00:11:18.668 "seek_hole": false, 00:11:18.668 
"seek_data": false, 00:11:18.668 "copy": true, 00:11:18.668 "nvme_iov_md": false 00:11:18.668 }, 00:11:18.668 "memory_domains": [ 00:11:18.668 { 00:11:18.668 "dma_device_id": "system", 00:11:18.668 "dma_device_type": 1 00:11:18.668 }, 00:11:18.668 { 00:11:18.668 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:18.668 "dma_device_type": 2 00:11:18.668 } 00:11:18.668 ], 00:11:18.668 "driver_specific": {} 00:11:18.668 } 00:11:18.668 ] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 Null_1 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@897 -- # local bdev_name=Null_1 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local i 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 11:52:32 
blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:18.668 [ 00:11:18.668 { 00:11:18.668 "name": "Null_1", 00:11:18.668 "aliases": [ 00:11:18.668 "f00edbd4-c9d3-4bae-9e32-9057f93b484b" 00:11:18.668 ], 00:11:18.668 "product_name": "Null disk", 00:11:18.668 "block_size": 512, 00:11:18.668 "num_blocks": 262144, 00:11:18.668 "uuid": "f00edbd4-c9d3-4bae-9e32-9057f93b484b", 00:11:18.668 "assigned_rate_limits": { 00:11:18.668 "rw_ios_per_sec": 0, 00:11:18.668 "rw_mbytes_per_sec": 0, 00:11:18.668 "r_mbytes_per_sec": 0, 00:11:18.668 "w_mbytes_per_sec": 0 00:11:18.668 }, 00:11:18.668 "claimed": false, 00:11:18.668 "zoned": false, 00:11:18.668 "supported_io_types": { 00:11:18.668 "read": true, 00:11:18.668 "write": true, 00:11:18.668 "unmap": false, 00:11:18.668 "flush": false, 00:11:18.668 "reset": true, 00:11:18.668 "nvme_admin": false, 00:11:18.668 "nvme_io": false, 00:11:18.668 "nvme_io_md": false, 00:11:18.668 "write_zeroes": true, 00:11:18.668 "zcopy": false, 00:11:18.668 "get_zone_info": false, 00:11:18.668 "zone_management": false, 00:11:18.668 "zone_append": false, 00:11:18.668 "compare": false, 00:11:18.668 "compare_and_write": false, 00:11:18.668 "abort": true, 00:11:18.668 "seek_hole": false, 00:11:18.668 "seek_data": false, 00:11:18.668 "copy": false, 00:11:18.668 "nvme_iov_md": false 00:11:18.668 }, 00:11:18.668 "driver_specific": {} 00:11:18.668 } 00:11:18.668 ] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # return 0 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@457 -- # qos_function_test 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:11:18.668 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:18.669 11:52:32 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:18.927 Running I/O for 60 seconds... 
00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 49027.04 196108.17 0.00 0.00 197632.00 0.00 0.00 ' 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=49027.04 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 49027 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=49027 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=12000 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 12000 -gt 1000 ']' 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 12000 Malloc_0 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 12000 IOPS Malloc_0 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:24.198 11:52:37 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:24.198 ************************************ 00:11:24.198 START TEST bdev_qos_iops 00:11:24.198 ************************************ 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1123 -- # run_qos_test 12000 IOPS Malloc_0 00:11:24.198 11:52:37 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=12000 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:24.198 11:52:37 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 12003.13 48012.54 0.00 0.00 48912.00 0.00 0.00 ' 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # iostat_result=12003.13 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 12003 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=12003 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=10800 00:11:29.469 11:52:42 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=13200 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 12003 -lt 10800 ']' 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 12003 -gt 13200 ']' 00:11:29.469 00:11:29.469 real 0m5.289s 00:11:29.469 user 0m0.123s 00:11:29.469 sys 0m0.047s 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:29.469 11:52:42 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:11:29.469 ************************************ 00:11:29.469 END TEST bdev_qos_iops 00:11:29.469 ************************************ 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:29.469 11:52:42 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 15109.83 60439.33 0.00 0.00 61440.00 0.00 0.00 ' 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:34.744 11:52:48 
blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=61440.00 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 61440 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=61440 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=6 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 6 -lt 2 ']' 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:34.744 11:52:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:34.744 ************************************ 00:11:34.744 START TEST bdev_qos_bw 00:11:34.744 ************************************ 00:11:34.744 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1123 -- # run_qos_test 6 BANDWIDTH Null_1 00:11:34.744 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=6 00:11:34.744 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:34.744 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:11:34.744 
11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:34.744 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:11:34.745 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:34.745 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:34.745 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:11:34.745 11:52:48 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1536.06 6144.24 0.00 0.00 6272.00 0.00 0.00 ' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=6272.00 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 6272 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=6272 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=6144 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=5529 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=6758 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- 
bdev/blockdev.sh@400 -- # '[' 6272 -lt 5529 ']' 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 6272 -gt 6758 ']' 00:11:40.095 00:11:40.095 real 0m5.314s 00:11:40.095 user 0m0.111s 00:11:40.095 sys 0m0.058s 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:11:40.095 ************************************ 00:11:40.095 END TEST bdev_qos_bw 00:11:40.095 ************************************ 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:40.095 11:52:53 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:40.095 ************************************ 00:11:40.095 START TEST bdev_qos_ro_bw 00:11:40.095 ************************************ 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1123 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:11:40.095 11:52:53 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:11:40.095 11:52:53 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 511.50 2046.01 0.00 0.00 2052.00 0.00 0.00 ' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2052.00 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2052 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # qos_result=2052 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 
00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -lt 1843 ']' 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2052 -gt 2252 ']' 00:11:45.370 00:11:45.370 real 0m5.181s 00:11:45.370 user 0m0.111s 00:11:45.370 sys 0m0.053s 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:45.370 11:52:58 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:11:45.370 ************************************ 00:11:45.370 END TEST bdev_qos_ro_bw 00:11:45.370 ************************************ 00:11:45.370 11:52:58 blockdev_general.bdev_qos -- common/autotest_common.sh@1142 -- # return 0 00:11:45.370 11:52:58 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:11:45.370 11:52:58 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:45.370 11:52:58 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:45.937 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:45.937 11:52:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:11:45.937 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:45.937 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:46.196 00:11:46.196 Latency(us) 00:11:46.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:46.196 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:46.196 Malloc_0 : 26.88 16378.17 63.98 0.00 0.00 15483.69 2521.71 503316.48 
00:11:46.196 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:46.196 Null_1 : 27.08 15613.21 60.99 0.00 0.00 16337.03 1025.78 197861.73 00:11:46.196 =================================================================================================================== 00:11:46.196 Total : 31991.37 124.97 0.00 0.00 15901.75 1025.78 503316.48 00:11:46.196 0 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 1446517 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@948 -- # '[' -z 1446517 ']' 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@952 -- # kill -0 1446517 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # uname 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1446517 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1446517' 00:11:46.196 killing process with pid 1446517 00:11:46.196 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # kill 1446517 00:11:46.196 Received shutdown signal, test time was about 27.143093 seconds 00:11:46.196 00:11:46.196 Latency(us) 00:11:46.196 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:46.196 =================================================================================================================== 00:11:46.196 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:46.196 
11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@972 -- # wait 1446517 00:11:46.455 11:52:59 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:11:46.455 00:11:46.455 real 0m29.093s 00:11:46.455 user 0m30.141s 00:11:46.455 sys 0m0.997s 00:11:46.455 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:46.455 11:52:59 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:11:46.455 ************************************ 00:11:46.455 END TEST bdev_qos 00:11:46.455 ************************************ 00:11:46.455 11:52:59 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:46.455 11:52:59 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:11:46.455 11:52:59 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:46.455 11:52:59 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:46.455 11:52:59 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:46.455 ************************************ 00:11:46.455 START TEST bdev_qd_sampling 00:11:46.455 ************************************ 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1123 -- # qd_sampling_test_suite '' 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=1450307 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 1450307' 00:11:46.455 Process bdev QD sampling period testing pid: 1450307 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:11:46.455 11:52:59 
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 1450307 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@829 -- # '[' -z 1450307 ']' 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:46.455 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:46.455 11:52:59 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:46.713 [2024-07-15 11:53:00.051948] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:11:46.713 [2024-07-15 11:53:00.052018] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450307 ] 00:11:46.713 [2024-07-15 11:53:00.180713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:46.713 [2024-07-15 11:53:00.287997] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:46.713 [2024-07-15 11:53:00.288002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@862 -- # return 0 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:47.648 Malloc_QD 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_QD 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local i 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:47.648 11:53:01 
blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:47.648 [ 00:11:47.648 { 00:11:47.648 "name": "Malloc_QD", 00:11:47.648 "aliases": [ 00:11:47.648 "036c1937-dba7-4747-bdbc-f0d301b00407" 00:11:47.648 ], 00:11:47.648 "product_name": "Malloc disk", 00:11:47.648 "block_size": 512, 00:11:47.648 "num_blocks": 262144, 00:11:47.648 "uuid": "036c1937-dba7-4747-bdbc-f0d301b00407", 00:11:47.648 "assigned_rate_limits": { 00:11:47.648 "rw_ios_per_sec": 0, 00:11:47.648 "rw_mbytes_per_sec": 0, 00:11:47.648 "r_mbytes_per_sec": 0, 00:11:47.648 "w_mbytes_per_sec": 0 00:11:47.648 }, 00:11:47.648 "claimed": false, 00:11:47.648 "zoned": false, 00:11:47.648 "supported_io_types": { 00:11:47.648 "read": true, 00:11:47.648 "write": true, 00:11:47.648 "unmap": true, 00:11:47.648 "flush": true, 00:11:47.648 "reset": true, 00:11:47.648 "nvme_admin": false, 00:11:47.648 "nvme_io": false, 00:11:47.648 "nvme_io_md": false, 00:11:47.648 "write_zeroes": true, 00:11:47.648 "zcopy": true, 00:11:47.648 "get_zone_info": false, 00:11:47.648 "zone_management": false, 00:11:47.648 "zone_append": false, 00:11:47.648 "compare": false, 00:11:47.648 "compare_and_write": false, 00:11:47.648 "abort": true, 00:11:47.648 "seek_hole": false, 00:11:47.648 "seek_data": false, 00:11:47.648 "copy": true, 
00:11:47.648 "nvme_iov_md": false 00:11:47.648 }, 00:11:47.648 "memory_domains": [ 00:11:47.648 { 00:11:47.648 "dma_device_id": "system", 00:11:47.648 "dma_device_type": 1 00:11:47.648 }, 00:11:47.648 { 00:11:47.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:47.648 "dma_device_type": 2 00:11:47.648 } 00:11:47.648 ], 00:11:47.648 "driver_specific": {} 00:11:47.648 } 00:11:47.648 ] 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # return 0 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:11:47.648 11:53:01 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:11:47.906 Running I/O for 5 seconds... 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # iostats='{ 00:11:49.809 "tick_rate": 2300000000, 00:11:49.809 "ticks": 11124676225437662, 00:11:49.809 "bdevs": [ 00:11:49.809 { 00:11:49.809 "name": "Malloc_QD", 00:11:49.809 "bytes_read": 673231360, 00:11:49.809 "num_read_ops": 164356, 00:11:49.809 "bytes_written": 0, 00:11:49.809 "num_write_ops": 0, 00:11:49.809 "bytes_unmapped": 0, 00:11:49.809 "num_unmap_ops": 0, 00:11:49.809 "bytes_copied": 0, 00:11:49.809 "num_copy_ops": 0, 00:11:49.809 "read_latency_ticks": 2115274824228, 00:11:49.809 "max_read_latency_ticks": 14963552, 00:11:49.809 "min_read_latency_ticks": 243376, 00:11:49.809 "write_latency_ticks": 0, 00:11:49.809 "max_write_latency_ticks": 0, 00:11:49.809 "min_write_latency_ticks": 0, 00:11:49.809 "unmap_latency_ticks": 0, 00:11:49.809 "max_unmap_latency_ticks": 0, 00:11:49.809 "min_unmap_latency_ticks": 0, 00:11:49.809 "copy_latency_ticks": 0, 00:11:49.809 "max_copy_latency_ticks": 0, 00:11:49.809 "min_copy_latency_ticks": 0, 00:11:49.809 "io_error": {}, 00:11:49.809 "queue_depth_polling_period": 10, 00:11:49.809 "queue_depth": 512, 00:11:49.809 "io_time": 30, 00:11:49.809 "weighted_io_time": 15360 00:11:49.809 } 00:11:49.809 ] 00:11:49.809 }' 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:11:49.809 11:53:03 
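The Malloc_QD iostat dump above carries the fields the QD-sampling test is built around. A hedged reading of how they relate (based on SPDK's queue-depth sampling, where io_time accumulates polling periods with a non-empty queue and weighted_io_time accumulates queue depth times polling period, so their ratio approximates the average queue depth):

```python
# Interpreting the Malloc_QD iostat fields from the JSON dump above.
# Hedged assumption: avg queue depth ~= weighted_io_time / io_time over the
# sampling window; this is a reading of the fields, not an authoritative formula.

io_time = 30              # from "io_time": 30 in the dump
weighted_io_time = 15360  # from "weighted_io_time": 15360 in the dump

avg_queue_depth = weighted_io_time // io_time
print(avg_queue_depth)  # 512, matching "queue_depth": 512 in the same dump
```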
blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:49.809 00:11:49.809 Latency(us) 00:11:49.809 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:49.809 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:11:49.809 Malloc_QD : 1.89 50104.54 195.72 0.00 0.00 5096.50 1381.95 5470.83 00:11:49.809 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:11:49.809 Malloc_QD : 1.89 41251.34 161.14 0.00 0.00 6187.44 1168.25 6525.11 00:11:49.809 =================================================================================================================== 00:11:49.809 Total : 91355.89 356.86 0.00 0.00 5589.29 1168.25 6525.11 00:11:49.809 0 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:49.809 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 1450307 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@948 -- # '[' -z 1450307 ']' 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@952 -- # kill -0 1450307 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # uname 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1450307 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1450307' 00:11:49.810 killing process with pid 1450307 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # kill 1450307 00:11:49.810 Received shutdown signal, test time was about 1.964673 seconds 00:11:49.810 00:11:49.810 Latency(us) 00:11:49.810 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:49.810 =================================================================================================================== 00:11:49.810 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:49.810 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@972 -- # wait 1450307 00:11:50.068 11:53:03 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:11:50.068 00:11:50.068 real 0m3.502s 00:11:50.068 user 0m6.952s 00:11:50.068 sys 0m0.474s 00:11:50.068 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:50.068 11:53:03 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:11:50.068 ************************************ 00:11:50.068 END TEST bdev_qd_sampling 00:11:50.068 ************************************ 00:11:50.068 11:53:03 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:50.068 11:53:03 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:11:50.068 11:53:03 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:50.068 11:53:03 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:50.068 11:53:03 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:50.068 ************************************ 00:11:50.068 START TEST bdev_error 00:11:50.068 ************************************ 00:11:50.068 11:53:03 blockdev_general.bdev_error -- 
common/autotest_common.sh@1123 -- # error_test_suite '' 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=1450857 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 1450857' 00:11:50.068 Process error testing pid: 1450857 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:11:50.068 11:53:03 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 1450857 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1450857 ']' 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:50.068 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:50.068 11:53:03 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:50.068 [2024-07-15 11:53:03.646299] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
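Both bdevperf invocations in this log pass a hex core mask (`-m 0x3` for the QD-sampling run, `-m 0x2` for this error run), and the DPDK notices that follow report the matching reactor cores. A small sketch of how such a mask maps to core indices; the helper name is illustrative, not SPDK API:

```python
# Map a DPDK-style hex core mask to the list of core indices whose bits are set.
# cores_from_mask is a hypothetical helper for illustration only.

def cores_from_mask(mask: str) -> list[int]:
    """Return the core indices corresponding to set bits in a hex core mask."""
    value = int(mask, 16)
    return [i for i in range(value.bit_length()) if value & (1 << i)]

print(cores_from_mask("0x3"))  # [0, 1] -> reactors on cores 0 and 1
print(cores_from_mask("0x2"))  # [1]    -> a single reactor on core 1
```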
00:11:50.068 [2024-07-15 11:53:03.646373] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1450857 ] 00:11:50.326 [2024-07-15 11:53:03.783472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:50.326 [2024-07-15 11:53:03.916054] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 Dev_1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 
11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 [ 00:11:51.264 { 00:11:51.264 "name": "Dev_1", 00:11:51.264 "aliases": [ 00:11:51.264 "5825f6d2-365c-40d0-bd7e-36ed24d91b85" 00:11:51.264 ], 00:11:51.264 "product_name": "Malloc disk", 00:11:51.264 "block_size": 512, 00:11:51.264 "num_blocks": 262144, 00:11:51.264 "uuid": "5825f6d2-365c-40d0-bd7e-36ed24d91b85", 00:11:51.264 "assigned_rate_limits": { 00:11:51.264 "rw_ios_per_sec": 0, 00:11:51.264 "rw_mbytes_per_sec": 0, 00:11:51.264 "r_mbytes_per_sec": 0, 00:11:51.264 "w_mbytes_per_sec": 0 00:11:51.264 }, 00:11:51.264 "claimed": false, 00:11:51.264 "zoned": false, 00:11:51.264 "supported_io_types": { 00:11:51.264 "read": true, 00:11:51.264 "write": true, 00:11:51.264 "unmap": true, 00:11:51.264 "flush": true, 00:11:51.264 "reset": true, 00:11:51.264 "nvme_admin": false, 00:11:51.264 "nvme_io": false, 00:11:51.264 "nvme_io_md": false, 00:11:51.264 "write_zeroes": true, 00:11:51.264 "zcopy": true, 00:11:51.264 "get_zone_info": false, 00:11:51.264 "zone_management": false, 00:11:51.264 "zone_append": false, 00:11:51.264 "compare": false, 00:11:51.264 "compare_and_write": false, 00:11:51.264 "abort": true, 00:11:51.264 "seek_hole": false, 00:11:51.264 "seek_data": false, 00:11:51.264 "copy": true, 00:11:51.264 "nvme_iov_md": false 00:11:51.264 }, 00:11:51.264 "memory_domains": [ 00:11:51.264 { 00:11:51.264 "dma_device_id": "system", 00:11:51.264 "dma_device_type": 1 00:11:51.264 }, 00:11:51.264 { 00:11:51.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:11:51.264 "dma_device_type": 2 00:11:51.264 } 00:11:51.264 ], 00:11:51.264 "driver_specific": {} 00:11:51.264 } 00:11:51.264 ] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 true 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 Dev_2 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:51.264 11:53:04 blockdev_general.bdev_error -- 
common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 [ 00:11:51.264 { 00:11:51.264 "name": "Dev_2", 00:11:51.264 "aliases": [ 00:11:51.264 "e35ff650-bef4-4aeb-828c-fa8a6144939f" 00:11:51.264 ], 00:11:51.264 "product_name": "Malloc disk", 00:11:51.264 "block_size": 512, 00:11:51.264 "num_blocks": 262144, 00:11:51.264 "uuid": "e35ff650-bef4-4aeb-828c-fa8a6144939f", 00:11:51.264 "assigned_rate_limits": { 00:11:51.264 "rw_ios_per_sec": 0, 00:11:51.264 "rw_mbytes_per_sec": 0, 00:11:51.264 "r_mbytes_per_sec": 0, 00:11:51.264 "w_mbytes_per_sec": 0 00:11:51.264 }, 00:11:51.264 "claimed": false, 00:11:51.264 "zoned": false, 00:11:51.264 "supported_io_types": { 00:11:51.264 "read": true, 00:11:51.264 "write": true, 00:11:51.264 "unmap": true, 00:11:51.264 "flush": true, 00:11:51.264 "reset": true, 00:11:51.264 "nvme_admin": false, 00:11:51.264 "nvme_io": false, 00:11:51.264 "nvme_io_md": false, 00:11:51.264 "write_zeroes": true, 00:11:51.264 "zcopy": true, 00:11:51.264 "get_zone_info": false, 00:11:51.264 "zone_management": false, 00:11:51.264 "zone_append": false, 00:11:51.264 "compare": false, 00:11:51.264 "compare_and_write": false, 00:11:51.264 "abort": true, 00:11:51.264 "seek_hole": false, 00:11:51.264 "seek_data": false, 00:11:51.264 "copy": true, 00:11:51.264 "nvme_iov_md": false 00:11:51.264 }, 00:11:51.264 "memory_domains": [ 00:11:51.264 { 00:11:51.264 "dma_device_id": "system", 00:11:51.264 "dma_device_type": 1 00:11:51.264 }, 00:11:51.264 { 
00:11:51.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:51.264 "dma_device_type": 2 00:11:51.264 } 00:11:51.264 ], 00:11:51.264 "driver_specific": {} 00:11:51.264 } 00:11:51.264 ] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:51.264 11:53:04 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:11:51.264 11:53:04 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:51.523 Running I/O for 5 seconds... 00:11:52.460 11:53:05 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 1450857 00:11:52.460 11:53:05 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 1450857' 00:11:52.460 Process is existed as continue on error is set. 
Pid: 1450857 00:11:52.460 11:53:05 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:52.460 11:53:05 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:52.460 11:53:05 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:52.460 11:53:05 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:11:52.460 Timeout while waiting for response: 00:11:52.460 00:11:52.460 00:11:56.658 00:11:56.658 Latency(us) 00:11:56.658 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:56.658 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:56.658 EE_Dev_1 : 0.90 29299.30 114.45 5.58 0.00 541.66 164.73 851.26 00:11:56.658 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:56.658 Dev_2 : 5.00 63409.41 247.69 0.00 0.00 247.85 84.15 27468.13 00:11:56.658 =================================================================================================================== 00:11:56.658 Total : 92708.71 362.14 5.58 0.00 270.31 84.15 27468.13 00:11:57.595 11:53:10 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 1450857 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@948 -- # '[' -z 1450857 ']' 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@952 -- # kill -0 1450857 00:11:57.595 11:53:10 
blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # uname 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1450857 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1450857' 00:11:57.595 killing process with pid 1450857 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # kill 1450857 00:11:57.595 Received shutdown signal, test time was about 5.000000 seconds 00:11:57.595 00:11:57.595 Latency(us) 00:11:57.595 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:57.595 =================================================================================================================== 00:11:57.595 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:11:57.595 11:53:10 blockdev_general.bdev_error -- common/autotest_common.sh@972 -- # wait 1450857 00:11:57.854 11:53:11 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=1451760 00:11:57.854 11:53:11 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 1451760' 00:11:57.854 Process error testing pid: 1451760 00:11:57.854 11:53:11 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:11:57.854 11:53:11 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 1451760 00:11:57.854 11:53:11 blockdev_general.bdev_error -- common/autotest_common.sh@829 -- # '[' -z 1451760 ']' 00:11:57.854 11:53:11 
blockdev_general.bdev_error -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:57.854 11:53:11 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:57.854 11:53:11 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:57.854 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:57.854 11:53:11 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:57.854 11:53:11 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:57.854 [2024-07-15 11:53:11.315765] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:11:57.854 [2024-07-15 11:53:11.315843] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1451760 ] 00:11:58.112 [2024-07-15 11:53:11.452115] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:58.112 [2024-07-15 11:53:11.578024] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:58.680 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:11:58.680 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@862 -- # return 0 00:11:58.680 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:11:58.680 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.680 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 Dev_1 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 
blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_1 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 [ 00:11:58.940 { 00:11:58.940 "name": "Dev_1", 00:11:58.940 "aliases": [ 00:11:58.940 "deb09ec9-593d-43fb-865c-341503367b83" 00:11:58.940 ], 00:11:58.940 "product_name": "Malloc disk", 00:11:58.940 "block_size": 512, 00:11:58.940 "num_blocks": 262144, 00:11:58.940 "uuid": "deb09ec9-593d-43fb-865c-341503367b83", 00:11:58.940 "assigned_rate_limits": { 00:11:58.940 "rw_ios_per_sec": 0, 00:11:58.940 "rw_mbytes_per_sec": 0, 00:11:58.940 "r_mbytes_per_sec": 0, 00:11:58.940 "w_mbytes_per_sec": 0 00:11:58.940 }, 00:11:58.940 "claimed": false, 00:11:58.940 "zoned": false, 00:11:58.940 "supported_io_types": { 00:11:58.940 "read": true, 00:11:58.940 
"write": true, 00:11:58.940 "unmap": true, 00:11:58.940 "flush": true, 00:11:58.940 "reset": true, 00:11:58.940 "nvme_admin": false, 00:11:58.940 "nvme_io": false, 00:11:58.940 "nvme_io_md": false, 00:11:58.940 "write_zeroes": true, 00:11:58.940 "zcopy": true, 00:11:58.940 "get_zone_info": false, 00:11:58.940 "zone_management": false, 00:11:58.940 "zone_append": false, 00:11:58.940 "compare": false, 00:11:58.940 "compare_and_write": false, 00:11:58.940 "abort": true, 00:11:58.940 "seek_hole": false, 00:11:58.940 "seek_data": false, 00:11:58.940 "copy": true, 00:11:58.940 "nvme_iov_md": false 00:11:58.940 }, 00:11:58.940 "memory_domains": [ 00:11:58.940 { 00:11:58.940 "dma_device_id": "system", 00:11:58.940 "dma_device_type": 1 00:11:58.940 }, 00:11:58.940 { 00:11:58.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.940 "dma_device_type": 2 00:11:58.940 } 00:11:58.940 ], 00:11:58.940 "driver_specific": {} 00:11:58.940 } 00:11:58.940 ] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:58.940 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 true 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 Dev_2 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 
-- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@897 -- # local bdev_name=Dev_2 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local i 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 [ 00:11:58.940 { 00:11:58.940 "name": "Dev_2", 00:11:58.940 "aliases": [ 00:11:58.940 "e1fadc49-50ad-406c-a3d2-8f9f8f5ec66d" 00:11:58.940 ], 00:11:58.940 "product_name": "Malloc disk", 00:11:58.940 "block_size": 512, 00:11:58.940 "num_blocks": 262144, 00:11:58.940 "uuid": "e1fadc49-50ad-406c-a3d2-8f9f8f5ec66d", 00:11:58.940 "assigned_rate_limits": { 00:11:58.940 "rw_ios_per_sec": 0, 00:11:58.940 "rw_mbytes_per_sec": 0, 00:11:58.940 "r_mbytes_per_sec": 0, 00:11:58.940 "w_mbytes_per_sec": 0 00:11:58.940 }, 00:11:58.940 "claimed": false, 00:11:58.940 "zoned": false, 00:11:58.940 "supported_io_types": { 
00:11:58.940 "read": true, 00:11:58.940 "write": true, 00:11:58.940 "unmap": true, 00:11:58.940 "flush": true, 00:11:58.940 "reset": true, 00:11:58.940 "nvme_admin": false, 00:11:58.940 "nvme_io": false, 00:11:58.940 "nvme_io_md": false, 00:11:58.940 "write_zeroes": true, 00:11:58.940 "zcopy": true, 00:11:58.940 "get_zone_info": false, 00:11:58.940 "zone_management": false, 00:11:58.940 "zone_append": false, 00:11:58.940 "compare": false, 00:11:58.940 "compare_and_write": false, 00:11:58.940 "abort": true, 00:11:58.940 "seek_hole": false, 00:11:58.940 "seek_data": false, 00:11:58.940 "copy": true, 00:11:58.940 "nvme_iov_md": false 00:11:58.940 }, 00:11:58.940 "memory_domains": [ 00:11:58.940 { 00:11:58.940 "dma_device_id": "system", 00:11:58.940 "dma_device_type": 1 00:11:58.940 }, 00:11:58.940 { 00:11:58.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:58.940 "dma_device_type": 2 00:11:58.940 } 00:11:58.940 ], 00:11:58.940 "driver_specific": {} 00:11:58.940 } 00:11:58.940 ] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # return 0 00:11:58.940 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@559 -- # xtrace_disable 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:58.940 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:11:58.941 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 1451760 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@648 -- # local es=0 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # valid_exec_arg wait 1451760 00:11:58.941 11:53:12 blockdev_general.bdev_error -- 
common/autotest_common.sh@636 -- # local arg=wait 00:11:58.941 11:53:12 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # type -t wait 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:11:58.941 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # wait 1451760 00:11:58.941 Running I/O for 5 seconds... 00:11:59.200 task offset: 159304 on job bdev=EE_Dev_1 fails 00:11:59.200 00:11:59.200 Latency(us) 00:11:59.200 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:11:59.200 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:59.200 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:11:59.200 EE_Dev_1 : 0.00 23913.04 93.41 5434.78 0.00 454.17 162.06 804.95 00:11:59.200 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:11:59.200 Dev_2 : 0.00 14492.75 56.61 0.00 0.00 827.33 157.61 1538.67 00:11:59.200 =================================================================================================================== 00:11:59.200 Total : 38405.80 150.02 5434.78 0.00 656.56 157.61 1538.67 00:11:59.200 [2024-07-15 11:53:12.541560] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:11:59.200 request: 00:11:59.200 { 00:11:59.200 "method": "perform_tests", 00:11:59.200 "req_id": 1 00:11:59.200 } 00:11:59.200 Got JSON-RPC error response 00:11:59.200 response: 00:11:59.200 { 00:11:59.200 "code": -32603, 00:11:59.200 "message": "bdevperf failed with error Operation not permitted" 00:11:59.200 } 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # 
es=255 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # es=127 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # case "$es" in 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@668 -- # es=1 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:11:59.460 00:11:59.460 real 0m9.340s 00:11:59.460 user 0m9.631s 00:11:59.460 sys 0m1.020s 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # xtrace_disable 00:11:59.460 11:53:12 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:11:59.460 ************************************ 00:11:59.460 END TEST bdev_error 00:11:59.460 ************************************ 00:11:59.460 11:53:12 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:11:59.460 11:53:12 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:11:59.460 11:53:12 blockdev_general -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:11:59.460 11:53:12 blockdev_general -- common/autotest_common.sh@1105 -- # xtrace_disable 00:11:59.460 11:53:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:11:59.460 ************************************ 00:11:59.460 START TEST bdev_stat 00:11:59.460 ************************************ 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@1123 -- # stat_test_suite '' 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=1452105 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 1452105' 00:11:59.460 Process Bdev IO statistics 
testing pid: 1452105 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 1452105 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@829 -- # '[' -z 1452105 ']' 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local max_retries=100 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:11:59.460 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # xtrace_disable 00:11:59.460 11:53:13 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:11:59.720 [2024-07-15 11:53:13.076705] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:11:59.720 [2024-07-15 11:53:13.076778] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1452105 ] 00:11:59.720 [2024-07-15 11:53:13.204200] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:11:59.720 [2024-07-15 11:53:13.315839] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:11:59.720 [2024-07-15 11:53:13.315846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:00.657 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@862 -- # return 0 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:00.658 Malloc_STAT 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@897 -- # local bdev_name=Malloc_STAT 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local i 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 
00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:00.658 [ 00:12:00.658 { 00:12:00.658 "name": "Malloc_STAT", 00:12:00.658 "aliases": [ 00:12:00.658 "dcb108ad-deb6-40fb-8d38-f2c64cf81342" 00:12:00.658 ], 00:12:00.658 "product_name": "Malloc disk", 00:12:00.658 "block_size": 512, 00:12:00.658 "num_blocks": 262144, 00:12:00.658 "uuid": "dcb108ad-deb6-40fb-8d38-f2c64cf81342", 00:12:00.658 "assigned_rate_limits": { 00:12:00.658 "rw_ios_per_sec": 0, 00:12:00.658 "rw_mbytes_per_sec": 0, 00:12:00.658 "r_mbytes_per_sec": 0, 00:12:00.658 "w_mbytes_per_sec": 0 00:12:00.658 }, 00:12:00.658 "claimed": false, 00:12:00.658 "zoned": false, 00:12:00.658 "supported_io_types": { 00:12:00.658 "read": true, 00:12:00.658 "write": true, 00:12:00.658 "unmap": true, 00:12:00.658 "flush": true, 00:12:00.658 "reset": true, 00:12:00.658 "nvme_admin": false, 00:12:00.658 "nvme_io": false, 00:12:00.658 "nvme_io_md": false, 00:12:00.658 "write_zeroes": true, 00:12:00.658 "zcopy": true, 00:12:00.658 "get_zone_info": false, 00:12:00.658 "zone_management": false, 00:12:00.658 "zone_append": false, 00:12:00.658 "compare": false, 00:12:00.658 "compare_and_write": false, 00:12:00.658 "abort": true, 00:12:00.658 "seek_hole": false, 00:12:00.658 "seek_data": false, 00:12:00.658 "copy": true, 00:12:00.658 "nvme_iov_md": false 00:12:00.658 }, 00:12:00.658 "memory_domains": [ 00:12:00.658 { 00:12:00.658 "dma_device_id": "system", 
00:12:00.658 "dma_device_type": 1 00:12:00.658 }, 00:12:00.658 { 00:12:00.658 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.658 "dma_device_type": 2 00:12:00.658 } 00:12:00.658 ], 00:12:00.658 "driver_specific": {} 00:12:00.658 } 00:12:00.658 ] 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # return 0 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:12:00.658 11:53:14 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:12:00.658 Running I/O for 10 seconds... 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:02.560 
11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:12:02.560 "tick_rate": 2300000000, 00:12:02.560 "ticks": 11124706068778704, 00:12:02.560 "bdevs": [ 00:12:02.560 { 00:12:02.560 "name": "Malloc_STAT", 00:12:02.560 "bytes_read": 707834368, 00:12:02.560 "num_read_ops": 172804, 00:12:02.560 "bytes_written": 0, 00:12:02.560 "num_write_ops": 0, 00:12:02.560 "bytes_unmapped": 0, 00:12:02.560 "num_unmap_ops": 0, 00:12:02.560 "bytes_copied": 0, 00:12:02.560 "num_copy_ops": 0, 00:12:02.560 "read_latency_ticks": 2227026400752, 00:12:02.560 "max_read_latency_ticks": 17291056, 00:12:02.560 "min_read_latency_ticks": 241510, 00:12:02.560 "write_latency_ticks": 0, 00:12:02.560 "max_write_latency_ticks": 0, 00:12:02.560 "min_write_latency_ticks": 0, 00:12:02.560 "unmap_latency_ticks": 0, 00:12:02.560 "max_unmap_latency_ticks": 0, 00:12:02.560 "min_unmap_latency_ticks": 0, 00:12:02.560 "copy_latency_ticks": 0, 00:12:02.560 "max_copy_latency_ticks": 0, 00:12:02.560 "min_copy_latency_ticks": 0, 00:12:02.560 "io_error": {} 00:12:02.560 } 00:12:02.560 ] 00:12:02.560 }' 00:12:02.560 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=172804 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:12:02.819 "tick_rate": 2300000000, 00:12:02.819 "ticks": 11124706233853920, 
00:12:02.819 "name": "Malloc_STAT", 00:12:02.819 "channels": [ 00:12:02.819 { 00:12:02.819 "thread_id": 2, 00:12:02.819 "bytes_read": 405798912, 00:12:02.819 "num_read_ops": 99072, 00:12:02.819 "bytes_written": 0, 00:12:02.819 "num_write_ops": 0, 00:12:02.819 "bytes_unmapped": 0, 00:12:02.819 "num_unmap_ops": 0, 00:12:02.819 "bytes_copied": 0, 00:12:02.819 "num_copy_ops": 0, 00:12:02.819 "read_latency_ticks": 1155571538130, 00:12:02.819 "max_read_latency_ticks": 12718678, 00:12:02.819 "min_read_latency_ticks": 8250940, 00:12:02.819 "write_latency_ticks": 0, 00:12:02.819 "max_write_latency_ticks": 0, 00:12:02.819 "min_write_latency_ticks": 0, 00:12:02.819 "unmap_latency_ticks": 0, 00:12:02.819 "max_unmap_latency_ticks": 0, 00:12:02.819 "min_unmap_latency_ticks": 0, 00:12:02.819 "copy_latency_ticks": 0, 00:12:02.819 "max_copy_latency_ticks": 0, 00:12:02.819 "min_copy_latency_ticks": 0 00:12:02.819 }, 00:12:02.819 { 00:12:02.819 "thread_id": 3, 00:12:02.819 "bytes_read": 329252864, 00:12:02.819 "num_read_ops": 80384, 00:12:02.819 "bytes_written": 0, 00:12:02.819 "num_write_ops": 0, 00:12:02.819 "bytes_unmapped": 0, 00:12:02.819 "num_unmap_ops": 0, 00:12:02.819 "bytes_copied": 0, 00:12:02.819 "num_copy_ops": 0, 00:12:02.819 "read_latency_ticks": 1157636509268, 00:12:02.819 "max_read_latency_ticks": 17291056, 00:12:02.819 "min_read_latency_ticks": 9477744, 00:12:02.819 "write_latency_ticks": 0, 00:12:02.819 "max_write_latency_ticks": 0, 00:12:02.819 "min_write_latency_ticks": 0, 00:12:02.819 "unmap_latency_ticks": 0, 00:12:02.819 "max_unmap_latency_ticks": 0, 00:12:02.819 "min_unmap_latency_ticks": 0, 00:12:02.819 "copy_latency_ticks": 0, 00:12:02.819 "max_copy_latency_ticks": 0, 00:12:02.819 "min_copy_latency_ticks": 0 00:12:02.819 } 00:12:02.819 ] 00:12:02.819 }' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # 
io_count_per_channel1=99072 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=99072 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=80384 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=179456 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:12:02.819 "tick_rate": 2300000000, 00:12:02.819 "ticks": 11124706514474414, 00:12:02.819 "bdevs": [ 00:12:02.819 { 00:12:02.819 "name": "Malloc_STAT", 00:12:02.819 "bytes_read": 780186112, 00:12:02.819 "num_read_ops": 190468, 00:12:02.819 "bytes_written": 0, 00:12:02.819 "num_write_ops": 0, 00:12:02.819 "bytes_unmapped": 0, 00:12:02.819 "num_unmap_ops": 0, 00:12:02.819 "bytes_copied": 0, 00:12:02.819 "num_copy_ops": 0, 00:12:02.819 "read_latency_ticks": 2454555063048, 00:12:02.819 "max_read_latency_ticks": 17291056, 00:12:02.819 "min_read_latency_ticks": 241510, 00:12:02.819 "write_latency_ticks": 0, 00:12:02.819 "max_write_latency_ticks": 0, 00:12:02.819 "min_write_latency_ticks": 0, 00:12:02.819 "unmap_latency_ticks": 0, 00:12:02.819 "max_unmap_latency_ticks": 0, 00:12:02.819 "min_unmap_latency_ticks": 0, 00:12:02.819 "copy_latency_ticks": 0, 00:12:02.819 "max_copy_latency_ticks": 0, 00:12:02.819 "min_copy_latency_ticks": 0, 00:12:02.819 "io_error": {} 00:12:02.819 } 00:12:02.819 ] 00:12:02.819 }' 00:12:02.819 
11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=190468 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179456 -lt 172804 ']' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179456 -gt 190468 ']' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@559 -- # xtrace_disable 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:02.819 00:12:02.819 Latency(us) 00:12:02.819 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:02.819 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:12:02.819 Malloc_STAT : 2.17 50412.34 196.92 0.00 0.00 5066.13 1389.08 5556.31 00:12:02.819 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:12:02.819 Malloc_STAT : 2.17 40827.07 159.48 0.00 0.00 6254.76 1182.50 7522.39 00:12:02.819 =================================================================================================================== 00:12:02.819 Total : 91239.41 356.40 0.00 0.00 5598.17 1182.50 7522.39 00:12:02.819 0 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 1452105 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@948 -- # '[' -z 1452105 ']' 00:12:02.819 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@952 -- # kill -0 1452105 00:12:02.820 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # uname 00:12:02.820 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # '[' Linux = 
Linux ']' 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1452105 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1452105' 00:12:03.078 killing process with pid 1452105 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # kill 1452105 00:12:03.078 Received shutdown signal, test time was about 2.243920 seconds 00:12:03.078 00:12:03.078 Latency(us) 00:12:03.078 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:12:03.078 =================================================================================================================== 00:12:03.078 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@972 -- # wait 1452105 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:12:03.078 00:12:03.078 real 0m3.651s 00:12:03.078 user 0m7.301s 00:12:03.078 sys 0m0.465s 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:03.078 11:53:16 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:12:03.078 ************************************ 00:12:03.078 END TEST bdev_stat 00:12:03.078 ************************************ 00:12:03.338 11:53:16 blockdev_general -- common/autotest_common.sh@1142 -- # return 0 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 
00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:12:03.338 11:53:16 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:12:03.338 00:12:03.338 real 1m59.819s 00:12:03.338 user 7m16.967s 00:12:03.338 sys 0m23.563s 00:12:03.338 11:53:16 blockdev_general -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:03.338 11:53:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:12:03.338 ************************************ 00:12:03.338 END TEST blockdev_general 00:12:03.338 ************************************ 00:12:03.338 11:53:16 -- common/autotest_common.sh@1142 -- # return 0 00:12:03.338 11:53:16 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:03.338 11:53:16 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:03.338 11:53:16 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:03.338 11:53:16 -- common/autotest_common.sh@10 -- # set +x 00:12:03.338 ************************************ 00:12:03.338 START TEST bdev_raid 00:12:03.338 ************************************ 00:12:03.338 11:53:16 bdev_raid -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:12:03.338 * Looking for test storage... 
00:12:03.338 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:12:03.338 11:53:16 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:12:03.338 11:53:16 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:12:03.338 11:53:16 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:12:03.597 11:53:16 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:12:03.597 11:53:16 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:03.597 11:53:16 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:03.597 11:53:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:03.597 ************************************ 00:12:03.597 START TEST raid_function_test_raid0 00:12:03.597 ************************************ 00:12:03.597 11:53:16 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1123 -- # raid_function_test raid0 00:12:03.597 11:53:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:12:03.597 11:53:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:03.597 11:53:16 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:03.597 11:53:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=1452660 00:12:03.597 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1452660' 00:12:03.597 Process raid pid: 1452660 00:12:03.597 11:53:16 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:03.597 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 1452660 /var/tmp/spdk-raid.sock 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@829 -- # '[' -z 1452660 ']' 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:03.598 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:03.598 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:03.598 [2024-07-15 11:53:17.060898] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:12:03.598 [2024-07-15 11:53:17.060968] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:03.598 [2024-07-15 11:53:17.192153] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:03.856 [2024-07-15 11:53:17.300524] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:03.856 [2024-07-15 11:53:17.366143] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:03.856 [2024-07-15 11:53:17.366178] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@862 -- # return 0 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat 00:12:04.422 11:53:17 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:04.680 [2024-07-15 11:53:18.254295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:04.680 [2024-07-15 11:53:18.255765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:04.680 [2024-07-15 11:53:18.255822] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe75d20 00:12:04.681 [2024-07-15 11:53:18.255832] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:04.681 [2024-07-15 11:53:18.256013] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe755d0 00:12:04.681 [2024-07-15 11:53:18.256129] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe75d20 00:12:04.681 [2024-07-15 11:53:18.256139] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0xe75d20 00:12:04.681 [2024-07-15 11:53:18.256239] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:04.681 Base_1 00:12:04.681 Base_2 00:12:04.939 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:04.939 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:04.939 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 
00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:05.198 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:05.198 [2024-07-15 11:53:18.763658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe755d0 00:12:05.198 /dev/nbd0 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local i 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # break 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:05.457 1+0 records in 00:12:05.457 1+0 records out 
00:12:05.457 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028061 s, 14.6 MB/s 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # size=4096 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # return 0 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:05.457 11:53:18 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:05.715 { 00:12:05.715 "nbd_device": "/dev/nbd0", 00:12:05.715 "bdev_name": "raid" 00:12:05.715 } 00:12:05.715 ]' 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:05.715 { 00:12:05.715 "nbd_device": "/dev/nbd0", 00:12:05.715 "bdev_name": "raid" 00:12:05.715 } 00:12:05.715 ]' 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- 
bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:12:05.715 11:53:19 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:05.715 4096+0 records in 00:12:05.715 4096+0 records out 00:12:05.715 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.031149 s, 67.3 MB/s 00:12:05.715 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:06.034 4096+0 records in 00:12:06.034 4096+0 records out 00:12:06.034 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.216136 s, 9.7 MB/s 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:06.034 128+0 records in 00:12:06.034 128+0 records out 00:12:06.034 65536 
bytes (66 kB, 64 KiB) copied, 0.000824473 s, 79.5 MB/s 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:06.034 2035+0 records in 00:12:06.034 2035+0 records out 00:12:06.034 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.011708 s, 89.0 MB/s 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:06.034 456+0 records in 00:12:06.034 456+0 records out 00:12:06.034 233472 bytes (233 kB, 228 KiB) copied, 0.0027136 s, 86.0 MB/s 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:06.034 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:06.035 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:06.293 [2024-07-15 11:53:19.763533] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd0 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:06.293 11:53:19 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 
00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 1452660 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@948 -- # '[' -z 1452660 ']' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@952 -- # kill -0 1452660 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # uname 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:06.552 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1452660 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1452660' 00:12:06.810 killing process with pid 1452660 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # kill 1452660 00:12:06.810 [2024-07-15 11:53:20.151839] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:06.810 [2024-07-15 11:53:20.151911] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:06.810 [2024-07-15 11:53:20.151956] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:06.810 [2024-07-15 11:53:20.151971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe75d20 name 
raid, state offline 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@972 -- # wait 1452660 00:12:06.810 [2024-07-15 11:53:20.170339] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:12:06.810 00:12:06.810 real 0m3.393s 00:12:06.810 user 0m4.526s 00:12:06.810 sys 0m1.248s 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:06.810 11:53:20 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:12:06.810 ************************************ 00:12:06.810 END TEST raid_function_test_raid0 00:12:06.810 ************************************ 00:12:07.069 11:53:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:07.069 11:53:20 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:12:07.069 11:53:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:12:07.069 11:53:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:07.069 11:53:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:07.069 ************************************ 00:12:07.069 START TEST raid_function_test_concat 00:12:07.069 ************************************ 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1123 -- # raid_function_test concat 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=1453188 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- 
bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 1453188' 00:12:07.069 Process raid pid: 1453188 00:12:07.069 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 1453188 /var/tmp/spdk-raid.sock 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@829 -- # '[' -z 1453188 ']' 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:07.070 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:07.070 11:53:20 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:07.070 [2024-07-15 11:53:20.534582] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:12:07.070 [2024-07-15 11:53:20.534648] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:07.070 [2024-07-15 11:53:20.665505] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:07.329 [2024-07-15 11:53:20.770187] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:07.329 [2024-07-15 11:53:20.824743] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.329 [2024-07-15 11:53:20.824770] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@862 -- # return 0 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:12:07.896 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:12:08.154 [2024-07-15 11:53:21.744590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:08.155 [2024-07-15 11:53:21.746067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:08.155 [2024-07-15 11:53:21.746124] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bc3d20 00:12:08.155 [2024-07-15 11:53:21.746134] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:08.155 [2024-07-15 11:53:21.746321] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc35d0 00:12:08.155 [2024-07-15 11:53:21.746439] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bc3d20 00:12:08.155 [2024-07-15 11:53:21.746449] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1bc3d20 00:12:08.155 [2024-07-15 11:53:21.746547] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:08.155 Base_1 00:12:08.155 Base_2 00:12:08.414 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:12:08.414 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:12:08.414 11:53:21 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:08.672 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:12:08.672 [2024-07-15 11:53:22.249948] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bc35d0 00:12:08.672 /dev/nbd0 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local i 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # break 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:12:08.931 1+0 records in 
00:12:08.931 1+0 records out 00:12:08.931 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274698 s, 14.9 MB/s 00:12:08.931 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # size=4096 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # return 0 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:08.932 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:12:09.191 { 00:12:09.191 "nbd_device": "/dev/nbd0", 00:12:09.191 "bdev_name": "raid" 00:12:09.191 } 00:12:09.191 ]' 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:12:09.191 { 00:12:09.191 "nbd_device": "/dev/nbd0", 00:12:09.191 "bdev_name": "raid" 00:12:09.191 } 00:12:09.191 ]' 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:09.191 11:53:22 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:12:09.191 4096+0 records in 00:12:09.191 4096+0 records out 00:12:09.191 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0310915 s, 67.5 MB/s 00:12:09.191 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:12:09.450 4096+0 records in 00:12:09.450 4096+0 records out 00:12:09.450 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.292673 s, 7.2 MB/s 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:12:09.450 
128+0 records in 00:12:09.450 128+0 records out 00:12:09.450 65536 bytes (66 kB, 64 KiB) copied, 0.000846468 s, 77.4 MB/s 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:09.450 11:53:22 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:09.450 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:09.450 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:09.450 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:12:09.450 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:12:09.450 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:12:09.451 2035+0 records in 00:12:09.451 2035+0 records out 00:12:09.451 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0119224 s, 87.4 MB/s 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:12:09.451 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:12:09.710 456+0 records in 00:12:09.710 456+0 records out 00:12:09.710 233472 bytes (233 kB, 228 KiB) copied, 0.0027971 s, 83.5 MB/s 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:12:09.710 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:12:09.969 [2024-07-15 11:53:23.321654] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
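The three iterations above are `raid_unmap_data_verify` (bdev_raid.sh@18-54): seed 2 MiB of random data, mirror it onto the nbd device, then for each (offset, count) pair zero that region in the reference file, `blkdiscard` the same region on the device, flush, and `cmp`. A minimal file-backed sketch of the same compare logic — no SPDK, nbd, or blkdiscard required; a plain file stands in for `/dev/nbd0`, zeroing with `dd` stands in for discard, and the temp paths are illustrative:

```shell
# File-backed stand-in for the discard-verify loop: zeroed-on-discard semantics
# are simulated by writing zeroes to both the reference file and the "device".
workdir=$(mktemp -d)
dd if=/dev/urandom of="$workdir/ref" bs=512 count=4096 status=none  # 2 MiB reference
cp "$workdir/ref" "$workdir/dev"                                    # pretend device

unmap_blk_offs=('0' '1028' '321')   # same offsets/counts as the test run above
unmap_blk_nums=('128' '2035' '456')
for i in 0 1 2; do
  off=${unmap_blk_offs[$i]}; num=${unmap_blk_nums[$i]}
  dd if=/dev/zero of="$workdir/ref" bs=512 seek="$off" count="$num" conv=notrunc status=none
  dd if=/dev/zero of="$workdir/dev" bs=512 seek="$off" count="$num" conv=notrunc status=none
  cmp -b -n 2097152 "$workdir/ref" "$workdir/dev" || exit 1  # nonzero exit on any mismatch
done
echo "discard-verify OK"
```

In the real test the `cmp` is what matters: after a discard, reads from the raid bdev must return zeroes, so the device and the zero-patched reference file stay byte-identical.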
00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:12:09.969 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # 
true 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 1453188 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@948 -- # '[' -z 1453188 ']' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@952 -- # kill -0 1453188 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # uname 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1453188 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1453188' 00:12:10.227 killing process with pid 1453188 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # kill 1453188 00:12:10.227 [2024-07-15 11:53:23.694979] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:10.227 [2024-07-15 11:53:23.695049] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:10.227 [2024-07-15 11:53:23.695096] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:12:10.227 [2024-07-15 11:53:23.695109] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bc3d20 name raid, state offline 00:12:10.227 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@972 -- # wait 1453188 00:12:10.227 [2024-07-15 11:53:23.714151] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:10.487 11:53:23 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:12:10.487 00:12:10.487 real 0m3.449s 00:12:10.487 user 0m4.552s 00:12:10.487 sys 0m1.284s 00:12:10.487 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:10.487 11:53:23 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:12:10.487 ************************************ 00:12:10.487 END TEST raid_function_test_concat 00:12:10.487 ************************************ 00:12:10.487 11:53:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:10.487 11:53:23 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:12:10.487 11:53:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:12:10.487 11:53:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:10.487 11:53:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:10.487 ************************************ 00:12:10.487 START TEST raid0_resize_test 00:12:10.487 ************************************ 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1123 -- # raid0_resize_test 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:12:10.487 
11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local raid_size_mb 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=1453796 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 1453796' 00:12:10.487 Process raid pid: 1453796 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 1453796 /var/tmp/spdk-raid.sock 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@829 -- # '[' -z 1453796 ']' 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:10.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:10.487 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.487 [2024-07-15 11:53:24.070470] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
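The `raid0_resize_test` starting here builds a raid0 from two 32 MiB null bdevs with 512-byte blocks, then doubles each base via `bdev_null_resize` and expects the raid's `num_blocks` to double. The block-count arithmetic the test asserts can be sanity-checked standalone; `raid0_blocks` below is an illustrative helper, not an SPDK function:

```shell
# Expected num_blocks for a raid0 over equal-sized null bdevs (sketch, not SPDK code).
raid0_blocks() {
  local num_bases=$1 base_mb=$2 blksize=$3
  echo $(( num_bases * base_mb * 1024 * 1024 / blksize ))
}

blkcnt=$(raid0_blocks 2 32 512)        # matches the log: "blockcnt 131072, blocklen 512"
[ "$blkcnt" -eq 131072 ] || exit 1

raid_size_mb=$(( blkcnt * 512 / (1024 * 1024) ))
[ "$raid_size_mb" -eq 64 ] || exit 1   # bdev_raid.sh@369: raid_size_mb=64

# After bdev_null_resize grows both bases to 64 MiB:
[ "$(raid0_blocks 2 64 512)" -eq 262144 ] || exit 1  # "block count was changed from 131072 to 262144"
echo "resize arithmetic OK"
```

This mirrors the two `jq '.[].num_blocks'` checkpoints in the run: 131072 blocks before the resizes, 262144 after both bases are grown.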
00:12:10.487 [2024-07-15 11:53:24.070536] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:10.746 [2024-07-15 11:53:24.200993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:10.746 [2024-07-15 11:53:24.310486] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:11.004 [2024-07-15 11:53:24.373480] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.004 [2024-07-15 11:53:24.373511] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:11.004 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:11.004 11:53:24 bdev_raid.raid0_resize_test -- common/autotest_common.sh@862 -- # return 0 00:12:11.005 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:12:11.263 Base_1 00:12:11.263 11:53:24 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:12:11.522 Base_2 00:12:11.523 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:12:11.781 [2024-07-15 11:53:25.255889] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:12:11.781 [2024-07-15 11:53:25.257372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:12:11.781 [2024-07-15 11:53:25.257421] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14cd800 00:12:11.781 [2024-07-15 11:53:25.257431] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:11.781 [2024-07-15 11:53:25.257637] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1015e40 00:12:11.781 [2024-07-15 11:53:25.257735] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14cd800 00:12:11.781 [2024-07-15 11:53:25.257745] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x14cd800 00:12:11.781 [2024-07-15 11:53:25.257849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:11.781 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:12:12.040 [2024-07-15 11:53:25.504620] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:12.040 [2024-07-15 11:53:25.504644] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:12:12.040 true 00:12:12.040 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:12.040 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:12:12.300 [2024-07-15 11:53:25.753430] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.300 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:12:12.300 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:12:12.300 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:12:12.300 11:53:25 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:12:12.559 
[2024-07-15 11:53:26.005915] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:12:12.559 [2024-07-15 11:53:26.005935] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:12:12.559 [2024-07-15 11:53:26.005965] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:12:12.559 true 00:12:12.559 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:12:12.559 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:12:12.819 [2024-07-15 11:53:26.250747] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 1453796 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@948 -- # '[' -z 1453796 ']' 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@952 -- # kill -0 1453796 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # uname 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1453796 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1453796' 00:12:12.819 killing process with pid 1453796 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # kill 1453796 00:12:12.819 [2024-07-15 11:53:26.320127] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:12.819 [2024-07-15 11:53:26.320185] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:12.819 [2024-07-15 11:53:26.320231] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:12.819 [2024-07-15 11:53:26.320243] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14cd800 name Raid, state offline 00:12:12.819 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@972 -- # wait 1453796 00:12:12.819 [2024-07-15 11:53:26.321601] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:13.079 11:53:26 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:12:13.079 00:12:13.079 real 0m2.524s 00:12:13.079 user 0m4.151s 00:12:13.079 sys 0m0.673s 00:12:13.079 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:13.079 11:53:26 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.079 ************************************ 00:12:13.079 END TEST raid0_resize_test 00:12:13.079 ************************************ 00:12:13.079 11:53:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:13.079 11:53:26 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:12:13.079 11:53:26 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:12:13.079 11:53:26 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:12:13.079 11:53:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 
-le 1 ']' 00:12:13.079 11:53:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:13.079 11:53:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:13.079 ************************************ 00:12:13.079 START TEST raid_state_function_test 00:12:13.079 ************************************ 00:12:13.079 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 false 00:12:13.079 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:13.079 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:13.079 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:13.080 11:53:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1454146 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1454146' 00:12:13.080 Process raid pid: 1454146 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1454146 /var/tmp/spdk-raid.sock 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1454146 ']' 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:13.080 11:53:26 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:13.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:13.080 11:53:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:13.339 [2024-07-15 11:53:26.697860] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:13.339 [2024-07-15 11:53:26.697933] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:13.339 [2024-07-15 11:53:26.829732] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:13.339 [2024-07-15 11:53:26.933292] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:13.598 [2024-07-15 11:53:26.990875] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:13.598 [2024-07-15 11:53:26.990906] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:14.166 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:14.166 11:53:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:14.166 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:14.426 [2024-07-15 11:53:27.850809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:14.426 [2024-07-15 11:53:27.850854] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:14.426 [2024-07-15 11:53:27.850864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:14.426 [2024-07-15 11:53:27.850876] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.426 11:53:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.685 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.685 "name": "Existed_Raid", 00:12:14.685 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:14.685 "strip_size_kb": 64, 00:12:14.685 "state": "configuring", 00:12:14.685 "raid_level": "raid0", 00:12:14.685 "superblock": false, 00:12:14.685 "num_base_bdevs": 2, 00:12:14.685 "num_base_bdevs_discovered": 0, 00:12:14.685 "num_base_bdevs_operational": 2, 00:12:14.685 "base_bdevs_list": [ 00:12:14.685 { 00:12:14.685 "name": "BaseBdev1", 00:12:14.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.685 "is_configured": false, 00:12:14.685 "data_offset": 0, 00:12:14.685 "data_size": 0 00:12:14.685 }, 00:12:14.685 { 00:12:14.685 "name": "BaseBdev2", 00:12:14.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.685 "is_configured": false, 00:12:14.685 "data_offset": 0, 00:12:14.685 "data_size": 0 00:12:14.685 } 00:12:14.685 ] 00:12:14.685 }' 00:12:14.685 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.685 11:53:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.254 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:15.513 [2024-07-15 11:53:28.901463] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:15.513 [2024-07-15 11:53:28.901495] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x803b00 name Existed_Raid, state configuring 00:12:15.513 11:53:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:15.773 [2024-07-15 11:53:29.142110] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:15.773 [2024-07-15 11:53:29.142146] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:12:15.773 [2024-07-15 11:53:29.142156] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:15.773 [2024-07-15 11:53:29.142168] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:15.773 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:16.033 [2024-07-15 11:53:29.397823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:16.033 BaseBdev1 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:16.033 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:16.294 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:16.553 [ 00:12:16.553 { 00:12:16.553 "name": "BaseBdev1", 00:12:16.553 "aliases": [ 00:12:16.553 "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c" 00:12:16.553 ], 00:12:16.553 "product_name": "Malloc disk", 00:12:16.553 "block_size": 512, 00:12:16.553 "num_blocks": 65536, 
00:12:16.553 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:16.553 "assigned_rate_limits": { 00:12:16.553 "rw_ios_per_sec": 0, 00:12:16.553 "rw_mbytes_per_sec": 0, 00:12:16.553 "r_mbytes_per_sec": 0, 00:12:16.553 "w_mbytes_per_sec": 0 00:12:16.553 }, 00:12:16.553 "claimed": true, 00:12:16.553 "claim_type": "exclusive_write", 00:12:16.553 "zoned": false, 00:12:16.553 "supported_io_types": { 00:12:16.553 "read": true, 00:12:16.553 "write": true, 00:12:16.553 "unmap": true, 00:12:16.553 "flush": true, 00:12:16.553 "reset": true, 00:12:16.553 "nvme_admin": false, 00:12:16.553 "nvme_io": false, 00:12:16.553 "nvme_io_md": false, 00:12:16.553 "write_zeroes": true, 00:12:16.553 "zcopy": true, 00:12:16.553 "get_zone_info": false, 00:12:16.553 "zone_management": false, 00:12:16.553 "zone_append": false, 00:12:16.553 "compare": false, 00:12:16.553 "compare_and_write": false, 00:12:16.553 "abort": true, 00:12:16.553 "seek_hole": false, 00:12:16.553 "seek_data": false, 00:12:16.553 "copy": true, 00:12:16.553 "nvme_iov_md": false 00:12:16.553 }, 00:12:16.553 "memory_domains": [ 00:12:16.553 { 00:12:16.553 "dma_device_id": "system", 00:12:16.553 "dma_device_type": 1 00:12:16.553 }, 00:12:16.553 { 00:12:16.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:16.553 "dma_device_type": 2 00:12:16.553 } 00:12:16.553 ], 00:12:16.553 "driver_specific": {} 00:12:16.553 } 00:12:16.553 ] 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.553 11:53:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.813 11:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.813 "name": "Existed_Raid", 00:12:16.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.813 "strip_size_kb": 64, 00:12:16.813 "state": "configuring", 00:12:16.813 "raid_level": "raid0", 00:12:16.813 "superblock": false, 00:12:16.813 "num_base_bdevs": 2, 00:12:16.813 "num_base_bdevs_discovered": 1, 00:12:16.813 "num_base_bdevs_operational": 2, 00:12:16.813 "base_bdevs_list": [ 00:12:16.813 { 00:12:16.813 "name": "BaseBdev1", 00:12:16.813 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:16.813 "is_configured": true, 00:12:16.813 "data_offset": 0, 00:12:16.813 "data_size": 65536 00:12:16.813 }, 00:12:16.813 { 00:12:16.813 "name": "BaseBdev2", 00:12:16.813 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.813 "is_configured": false, 00:12:16.813 "data_offset": 0, 00:12:16.813 "data_size": 0 00:12:16.813 } 00:12:16.813 ] 00:12:16.813 }' 00:12:16.813 11:53:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.813 11:53:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.382 11:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:17.382 [2024-07-15 11:53:30.961976] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:17.382 [2024-07-15 11:53:30.962016] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8033d0 name Existed_Raid, state configuring 00:12:17.641 11:53:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:17.641 [2024-07-15 11:53:31.210652] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:17.641 [2024-07-15 11:53:31.212131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:17.641 [2024-07-15 11:53:31.212162] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.641 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.900 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.900 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:18.159 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:18.159 "name": "Existed_Raid", 00:12:18.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.159 "strip_size_kb": 64, 00:12:18.159 "state": "configuring", 00:12:18.159 "raid_level": "raid0", 00:12:18.159 "superblock": false, 00:12:18.159 "num_base_bdevs": 2, 00:12:18.159 "num_base_bdevs_discovered": 1, 00:12:18.159 "num_base_bdevs_operational": 2, 00:12:18.159 "base_bdevs_list": [ 00:12:18.159 { 00:12:18.159 "name": "BaseBdev1", 00:12:18.159 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:18.159 "is_configured": true, 00:12:18.159 "data_offset": 0, 00:12:18.159 "data_size": 65536 00:12:18.159 }, 00:12:18.159 { 00:12:18.159 "name": "BaseBdev2", 00:12:18.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:18.159 "is_configured": false, 00:12:18.159 "data_offset": 0, 00:12:18.159 "data_size": 0 00:12:18.159 } 00:12:18.159 ] 00:12:18.159 }' 
00:12:18.159 11:53:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:18.159 11:53:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:19.098 [2024-07-15 11:53:32.613674] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:19.098 [2024-07-15 11:53:32.613717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x804150 00:12:19.098 [2024-07-15 11:53:32.613725] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:19.098 [2024-07-15 11:53:32.613916] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x71e420 00:12:19.098 [2024-07-15 11:53:32.614031] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x804150 00:12:19.098 [2024-07-15 11:53:32.614041] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x804150 00:12:19.098 [2024-07-15 11:53:32.614201] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:19.098 BaseBdev2 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:12:19.098 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.358 11:53:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:19.617 [ 00:12:19.617 { 00:12:19.617 "name": "BaseBdev2", 00:12:19.617 "aliases": [ 00:12:19.617 "50136762-2539-422a-8ea1-cadd9fd746dc" 00:12:19.617 ], 00:12:19.617 "product_name": "Malloc disk", 00:12:19.617 "block_size": 512, 00:12:19.617 "num_blocks": 65536, 00:12:19.617 "uuid": "50136762-2539-422a-8ea1-cadd9fd746dc", 00:12:19.617 "assigned_rate_limits": { 00:12:19.617 "rw_ios_per_sec": 0, 00:12:19.617 "rw_mbytes_per_sec": 0, 00:12:19.617 "r_mbytes_per_sec": 0, 00:12:19.617 "w_mbytes_per_sec": 0 00:12:19.617 }, 00:12:19.617 "claimed": true, 00:12:19.617 "claim_type": "exclusive_write", 00:12:19.617 "zoned": false, 00:12:19.617 "supported_io_types": { 00:12:19.617 "read": true, 00:12:19.617 "write": true, 00:12:19.617 "unmap": true, 00:12:19.617 "flush": true, 00:12:19.617 "reset": true, 00:12:19.617 "nvme_admin": false, 00:12:19.617 "nvme_io": false, 00:12:19.617 "nvme_io_md": false, 00:12:19.617 "write_zeroes": true, 00:12:19.617 "zcopy": true, 00:12:19.617 "get_zone_info": false, 00:12:19.617 "zone_management": false, 00:12:19.617 "zone_append": false, 00:12:19.617 "compare": false, 00:12:19.617 "compare_and_write": false, 00:12:19.617 "abort": true, 00:12:19.617 "seek_hole": false, 00:12:19.617 "seek_data": false, 00:12:19.617 "copy": true, 00:12:19.617 "nvme_iov_md": false 00:12:19.617 }, 00:12:19.617 "memory_domains": [ 00:12:19.617 { 00:12:19.617 "dma_device_id": "system", 00:12:19.617 "dma_device_type": 1 00:12:19.617 }, 00:12:19.617 { 00:12:19.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.617 "dma_device_type": 2 
00:12:19.617 } 00:12:19.617 ], 00:12:19.617 "driver_specific": {} 00:12:19.617 } 00:12:19.617 ] 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.617 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.876 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:12:19.876 "name": "Existed_Raid", 00:12:19.876 "uuid": "9b01ffd8-04e0-4512-b51d-8318d2b4320b", 00:12:19.876 "strip_size_kb": 64, 00:12:19.876 "state": "online", 00:12:19.876 "raid_level": "raid0", 00:12:19.876 "superblock": false, 00:12:19.876 "num_base_bdevs": 2, 00:12:19.876 "num_base_bdevs_discovered": 2, 00:12:19.876 "num_base_bdevs_operational": 2, 00:12:19.876 "base_bdevs_list": [ 00:12:19.876 { 00:12:19.876 "name": "BaseBdev1", 00:12:19.876 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:19.876 "is_configured": true, 00:12:19.876 "data_offset": 0, 00:12:19.876 "data_size": 65536 00:12:19.876 }, 00:12:19.876 { 00:12:19.876 "name": "BaseBdev2", 00:12:19.876 "uuid": "50136762-2539-422a-8ea1-cadd9fd746dc", 00:12:19.876 "is_configured": true, 00:12:19.876 "data_offset": 0, 00:12:19.876 "data_size": 65536 00:12:19.876 } 00:12:19.876 ] 00:12:19.876 }' 00:12:19.876 11:53:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.876 11:53:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.811 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:20.811 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:20.812 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:20.812 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:20.812 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:20.812 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:20.812 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:20.812 11:53:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:21.071 [2024-07-15 11:53:34.478889] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:21.071 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:21.071 "name": "Existed_Raid", 00:12:21.071 "aliases": [ 00:12:21.071 "9b01ffd8-04e0-4512-b51d-8318d2b4320b" 00:12:21.071 ], 00:12:21.071 "product_name": "Raid Volume", 00:12:21.071 "block_size": 512, 00:12:21.071 "num_blocks": 131072, 00:12:21.071 "uuid": "9b01ffd8-04e0-4512-b51d-8318d2b4320b", 00:12:21.071 "assigned_rate_limits": { 00:12:21.071 "rw_ios_per_sec": 0, 00:12:21.071 "rw_mbytes_per_sec": 0, 00:12:21.071 "r_mbytes_per_sec": 0, 00:12:21.071 "w_mbytes_per_sec": 0 00:12:21.071 }, 00:12:21.071 "claimed": false, 00:12:21.071 "zoned": false, 00:12:21.071 "supported_io_types": { 00:12:21.071 "read": true, 00:12:21.071 "write": true, 00:12:21.071 "unmap": true, 00:12:21.071 "flush": true, 00:12:21.071 "reset": true, 00:12:21.071 "nvme_admin": false, 00:12:21.071 "nvme_io": false, 00:12:21.071 "nvme_io_md": false, 00:12:21.071 "write_zeroes": true, 00:12:21.071 "zcopy": false, 00:12:21.071 "get_zone_info": false, 00:12:21.071 "zone_management": false, 00:12:21.071 "zone_append": false, 00:12:21.071 "compare": false, 00:12:21.071 "compare_and_write": false, 00:12:21.071 "abort": false, 00:12:21.071 "seek_hole": false, 00:12:21.071 "seek_data": false, 00:12:21.071 "copy": false, 00:12:21.071 "nvme_iov_md": false 00:12:21.071 }, 00:12:21.071 "memory_domains": [ 00:12:21.071 { 00:12:21.071 "dma_device_id": "system", 00:12:21.071 "dma_device_type": 1 00:12:21.071 }, 00:12:21.071 { 00:12:21.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.071 "dma_device_type": 2 00:12:21.071 }, 00:12:21.071 { 00:12:21.071 "dma_device_id": "system", 00:12:21.071 "dma_device_type": 1 00:12:21.071 }, 00:12:21.071 { 00:12:21.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:12:21.071 "dma_device_type": 2 00:12:21.071 } 00:12:21.071 ], 00:12:21.071 "driver_specific": { 00:12:21.071 "raid": { 00:12:21.071 "uuid": "9b01ffd8-04e0-4512-b51d-8318d2b4320b", 00:12:21.071 "strip_size_kb": 64, 00:12:21.071 "state": "online", 00:12:21.072 "raid_level": "raid0", 00:12:21.072 "superblock": false, 00:12:21.072 "num_base_bdevs": 2, 00:12:21.072 "num_base_bdevs_discovered": 2, 00:12:21.072 "num_base_bdevs_operational": 2, 00:12:21.072 "base_bdevs_list": [ 00:12:21.072 { 00:12:21.072 "name": "BaseBdev1", 00:12:21.072 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:21.072 "is_configured": true, 00:12:21.072 "data_offset": 0, 00:12:21.072 "data_size": 65536 00:12:21.072 }, 00:12:21.072 { 00:12:21.072 "name": "BaseBdev2", 00:12:21.072 "uuid": "50136762-2539-422a-8ea1-cadd9fd746dc", 00:12:21.072 "is_configured": true, 00:12:21.072 "data_offset": 0, 00:12:21.072 "data_size": 65536 00:12:21.072 } 00:12:21.072 ] 00:12:21.072 } 00:12:21.072 } 00:12:21.072 }' 00:12:21.072 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:21.072 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:21.072 BaseBdev2' 00:12:21.072 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.072 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:21.072 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.331 "name": "BaseBdev1", 00:12:21.331 "aliases": [ 00:12:21.331 "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c" 00:12:21.331 ], 00:12:21.331 "product_name": "Malloc disk", 
00:12:21.331 "block_size": 512, 00:12:21.331 "num_blocks": 65536, 00:12:21.331 "uuid": "b1ec8c48-088d-4a8a-bec9-04b7c1d5f36c", 00:12:21.331 "assigned_rate_limits": { 00:12:21.331 "rw_ios_per_sec": 0, 00:12:21.331 "rw_mbytes_per_sec": 0, 00:12:21.331 "r_mbytes_per_sec": 0, 00:12:21.331 "w_mbytes_per_sec": 0 00:12:21.331 }, 00:12:21.331 "claimed": true, 00:12:21.331 "claim_type": "exclusive_write", 00:12:21.331 "zoned": false, 00:12:21.331 "supported_io_types": { 00:12:21.331 "read": true, 00:12:21.331 "write": true, 00:12:21.331 "unmap": true, 00:12:21.331 "flush": true, 00:12:21.331 "reset": true, 00:12:21.331 "nvme_admin": false, 00:12:21.331 "nvme_io": false, 00:12:21.331 "nvme_io_md": false, 00:12:21.331 "write_zeroes": true, 00:12:21.331 "zcopy": true, 00:12:21.331 "get_zone_info": false, 00:12:21.331 "zone_management": false, 00:12:21.331 "zone_append": false, 00:12:21.331 "compare": false, 00:12:21.331 "compare_and_write": false, 00:12:21.331 "abort": true, 00:12:21.331 "seek_hole": false, 00:12:21.331 "seek_data": false, 00:12:21.331 "copy": true, 00:12:21.331 "nvme_iov_md": false 00:12:21.331 }, 00:12:21.331 "memory_domains": [ 00:12:21.331 { 00:12:21.331 "dma_device_id": "system", 00:12:21.331 "dma_device_type": 1 00:12:21.331 }, 00:12:21.331 { 00:12:21.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.331 "dma_device_type": 2 00:12:21.331 } 00:12:21.331 ], 00:12:21.331 "driver_specific": {} 00:12:21.331 }' 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.331 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.590 11:53:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.590 11:53:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:21.590 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.849 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.849 "name": "BaseBdev2", 00:12:21.849 "aliases": [ 00:12:21.849 "50136762-2539-422a-8ea1-cadd9fd746dc" 00:12:21.849 ], 00:12:21.849 "product_name": "Malloc disk", 00:12:21.849 "block_size": 512, 00:12:21.849 "num_blocks": 65536, 00:12:21.849 "uuid": "50136762-2539-422a-8ea1-cadd9fd746dc", 00:12:21.849 "assigned_rate_limits": { 00:12:21.849 "rw_ios_per_sec": 0, 00:12:21.849 "rw_mbytes_per_sec": 0, 00:12:21.849 "r_mbytes_per_sec": 0, 00:12:21.849 "w_mbytes_per_sec": 0 00:12:21.849 }, 00:12:21.849 "claimed": true, 00:12:21.849 "claim_type": "exclusive_write", 00:12:21.849 "zoned": false, 00:12:21.849 "supported_io_types": { 00:12:21.849 "read": true, 00:12:21.849 "write": true, 00:12:21.849 "unmap": true, 00:12:21.849 "flush": true, 00:12:21.849 "reset": 
true, 00:12:21.849 "nvme_admin": false, 00:12:21.849 "nvme_io": false, 00:12:21.849 "nvme_io_md": false, 00:12:21.849 "write_zeroes": true, 00:12:21.849 "zcopy": true, 00:12:21.849 "get_zone_info": false, 00:12:21.849 "zone_management": false, 00:12:21.849 "zone_append": false, 00:12:21.849 "compare": false, 00:12:21.849 "compare_and_write": false, 00:12:21.849 "abort": true, 00:12:21.849 "seek_hole": false, 00:12:21.849 "seek_data": false, 00:12:21.849 "copy": true, 00:12:21.849 "nvme_iov_md": false 00:12:21.849 }, 00:12:21.849 "memory_domains": [ 00:12:21.849 { 00:12:21.849 "dma_device_id": "system", 00:12:21.849 "dma_device_type": 1 00:12:21.849 }, 00:12:21.849 { 00:12:21.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.849 "dma_device_type": 2 00:12:21.849 } 00:12:21.849 ], 00:12:21.849 "driver_specific": {} 00:12:21.849 }' 00:12:21.849 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.849 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.108 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.367 11:53:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.367 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:22.626 [2024-07-15 11:53:35.966625] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:22.626 [2024-07-15 11:53:35.966651] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.626 [2024-07-15 11:53:35.966702] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:22.626 11:53:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:22.885 11:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:22.885 "name": "Existed_Raid", 00:12:22.885 "uuid": "9b01ffd8-04e0-4512-b51d-8318d2b4320b", 00:12:22.885 "strip_size_kb": 64, 00:12:22.885 "state": "offline", 00:12:22.885 "raid_level": "raid0", 00:12:22.885 "superblock": false, 00:12:22.885 "num_base_bdevs": 2, 00:12:22.885 "num_base_bdevs_discovered": 1, 00:12:22.885 "num_base_bdevs_operational": 1, 00:12:22.885 "base_bdevs_list": [ 00:12:22.885 { 00:12:22.885 "name": null, 00:12:22.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:22.885 "is_configured": false, 00:12:22.885 "data_offset": 0, 00:12:22.885 "data_size": 65536 00:12:22.885 }, 00:12:22.885 { 00:12:22.885 "name": "BaseBdev2", 00:12:22.885 "uuid": "50136762-2539-422a-8ea1-cadd9fd746dc", 00:12:22.885 "is_configured": true, 00:12:22.885 "data_offset": 0, 00:12:22.885 "data_size": 65536 00:12:22.885 } 00:12:22.885 ] 00:12:22.885 }' 00:12:22.885 11:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:22.885 11:53:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.451 11:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:23.451 11:53:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.451 11:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.451 11:53:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:23.710 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:23.710 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:23.710 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:23.710 [2024-07-15 11:53:37.275897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:23.710 [2024-07-15 11:53:37.275944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x804150 name Existed_Raid, state offline 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:23.969 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:24.227 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1454146 
00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1454146 ']' 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1454146 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1454146 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1454146' 00:12:24.228 killing process with pid 1454146 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1454146 00:12:24.228 [2024-07-15 11:53:37.632583] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:24.228 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1454146 00:12:24.228 [2024-07-15 11:53:37.633437] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:24.487 00:12:24.487 real 0m11.212s 00:12:24.487 user 0m20.041s 00:12:24.487 sys 0m2.080s 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.487 ************************************ 00:12:24.487 END TEST raid_state_function_test 00:12:24.487 ************************************ 00:12:24.487 11:53:37 bdev_raid 
-- common/autotest_common.sh@1142 -- # return 0 00:12:24.487 11:53:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:12:24.487 11:53:37 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:24.487 11:53:37 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:24.487 11:53:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.487 ************************************ 00:12:24.487 START TEST raid_state_function_test_sb 00:12:24.487 ************************************ 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 2 true 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- 
# (( i <= num_base_bdevs )) 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1455825 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1455825' 00:12:24.487 Process raid pid: 1455825 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1455825 /var/tmp/spdk-raid.sock 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@829 -- # '[' -z 1455825 ']' 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.487 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:24.487 11:53:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:24.487 [2024-07-15 11:53:37.984986] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:24.487 [2024-07-15 11:53:37.985053] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:24.746 [2024-07-15 11:53:38.109221] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.746 [2024-07-15 11:53:38.216605] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.746 [2024-07-15 11:53:38.289052] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.746 [2024-07-15 11:53:38.289089] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.685 11:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:25.685 11:53:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:12:25.685 11:53:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:25.685 [2024-07-15 11:53:39.164366] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.685 [2024-07-15 11:53:39.164408] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.685 [2024-07-15 11:53:39.164419] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.685 [2024-07-15 11:53:39.164431] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:25.685 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:25.944 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:25.944 "name": "Existed_Raid", 00:12:25.944 "uuid": "923a6768-c040-4901-9336-d97d8a7bf350", 00:12:25.944 "strip_size_kb": 64, 00:12:25.944 "state": "configuring", 00:12:25.944 "raid_level": "raid0", 00:12:25.944 "superblock": true, 00:12:25.944 "num_base_bdevs": 2, 00:12:25.944 "num_base_bdevs_discovered": 0, 00:12:25.944 "num_base_bdevs_operational": 2, 00:12:25.944 "base_bdevs_list": [ 00:12:25.944 { 00:12:25.944 "name": "BaseBdev1", 00:12:25.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.944 "is_configured": false, 00:12:25.944 "data_offset": 0, 00:12:25.944 "data_size": 0 00:12:25.944 }, 00:12:25.944 { 00:12:25.944 "name": "BaseBdev2", 00:12:25.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:25.944 "is_configured": false, 00:12:25.944 "data_offset": 0, 00:12:25.944 "data_size": 0 00:12:25.944 } 00:12:25.944 ] 00:12:25.944 }' 00:12:25.944 11:53:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:25.944 11:53:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:26.510 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:26.768 [2024-07-15 11:53:40.303231] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:26.768 [2024-07-15 11:53:40.303265] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1234b00 name Existed_Raid, state configuring 00:12:26.768 11:53:40 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:27.026 [2024-07-15 11:53:40.551906] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:27.026 [2024-07-15 11:53:40.551938] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:27.026 [2024-07-15 11:53:40.551948] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:27.026 [2024-07-15 11:53:40.551959] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:27.026 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:27.285 [2024-07-15 11:53:40.814481] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.285 BaseBdev1 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:27.285 11:53:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:27.644 
11:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:27.902 [ 00:12:27.902 { 00:12:27.902 "name": "BaseBdev1", 00:12:27.902 "aliases": [ 00:12:27.902 "eaee1ddf-b49d-4f2a-b034-df79d0658b08" 00:12:27.902 ], 00:12:27.902 "product_name": "Malloc disk", 00:12:27.902 "block_size": 512, 00:12:27.902 "num_blocks": 65536, 00:12:27.902 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:27.902 "assigned_rate_limits": { 00:12:27.902 "rw_ios_per_sec": 0, 00:12:27.902 "rw_mbytes_per_sec": 0, 00:12:27.902 "r_mbytes_per_sec": 0, 00:12:27.902 "w_mbytes_per_sec": 0 00:12:27.902 }, 00:12:27.902 "claimed": true, 00:12:27.902 "claim_type": "exclusive_write", 00:12:27.902 "zoned": false, 00:12:27.902 "supported_io_types": { 00:12:27.902 "read": true, 00:12:27.902 "write": true, 00:12:27.902 "unmap": true, 00:12:27.902 "flush": true, 00:12:27.902 "reset": true, 00:12:27.902 "nvme_admin": false, 00:12:27.902 "nvme_io": false, 00:12:27.902 "nvme_io_md": false, 00:12:27.902 "write_zeroes": true, 00:12:27.902 "zcopy": true, 00:12:27.902 "get_zone_info": false, 00:12:27.902 "zone_management": false, 00:12:27.902 "zone_append": false, 00:12:27.902 "compare": false, 00:12:27.902 "compare_and_write": false, 00:12:27.902 "abort": true, 00:12:27.902 "seek_hole": false, 00:12:27.902 "seek_data": false, 00:12:27.902 "copy": true, 00:12:27.902 "nvme_iov_md": false 00:12:27.902 }, 00:12:27.902 "memory_domains": [ 00:12:27.902 { 00:12:27.902 "dma_device_id": "system", 00:12:27.902 "dma_device_type": 1 00:12:27.902 }, 00:12:27.902 { 00:12:27.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:27.902 "dma_device_type": 2 00:12:27.902 } 00:12:27.902 ], 00:12:27.902 "driver_specific": {} 00:12:27.902 } 00:12:27.902 ] 00:12:27.902 11:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:27.902 
11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.903 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.160 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.160 "name": "Existed_Raid", 00:12:28.160 "uuid": "44c95596-cb95-486a-9504-27e6eff7c476", 00:12:28.160 "strip_size_kb": 64, 00:12:28.161 "state": "configuring", 00:12:28.161 "raid_level": "raid0", 00:12:28.161 "superblock": true, 00:12:28.161 "num_base_bdevs": 2, 00:12:28.161 "num_base_bdevs_discovered": 1, 00:12:28.161 "num_base_bdevs_operational": 2, 00:12:28.161 
"base_bdevs_list": [ 00:12:28.161 { 00:12:28.161 "name": "BaseBdev1", 00:12:28.161 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:28.161 "is_configured": true, 00:12:28.161 "data_offset": 2048, 00:12:28.161 "data_size": 63488 00:12:28.161 }, 00:12:28.161 { 00:12:28.161 "name": "BaseBdev2", 00:12:28.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.161 "is_configured": false, 00:12:28.161 "data_offset": 0, 00:12:28.161 "data_size": 0 00:12:28.161 } 00:12:28.161 ] 00:12:28.161 }' 00:12:28.161 11:53:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.161 11:53:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.726 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:28.983 [2024-07-15 11:53:42.362639] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:28.983 [2024-07-15 11:53:42.362680] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12343d0 name Existed_Raid, state configuring 00:12:28.983 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:29.243 [2024-07-15 11:53:42.615351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:29.243 [2024-07-15 11:53:42.616876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:29.243 [2024-07-15 11:53:42.616916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.243 "name": "Existed_Raid", 00:12:29.243 "uuid": "db03cfc3-3d27-425b-b4fe-ddf882bd2d89", 00:12:29.243 "strip_size_kb": 64, 00:12:29.243 "state": "configuring", 00:12:29.243 "raid_level": "raid0", 00:12:29.243 "superblock": true, 00:12:29.243 "num_base_bdevs": 2, 00:12:29.243 
"num_base_bdevs_discovered": 1, 00:12:29.243 "num_base_bdevs_operational": 2, 00:12:29.243 "base_bdevs_list": [ 00:12:29.243 { 00:12:29.243 "name": "BaseBdev1", 00:12:29.243 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:29.243 "is_configured": true, 00:12:29.243 "data_offset": 2048, 00:12:29.243 "data_size": 63488 00:12:29.243 }, 00:12:29.243 { 00:12:29.243 "name": "BaseBdev2", 00:12:29.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.243 "is_configured": false, 00:12:29.243 "data_offset": 0, 00:12:29.243 "data_size": 0 00:12:29.243 } 00:12:29.243 ] 00:12:29.243 }' 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.243 11:53:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:30.179 [2024-07-15 11:53:43.653495] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:30.179 [2024-07-15 11:53:43.653643] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1235150 00:12:30.179 [2024-07-15 11:53:43.653657] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:30.179 [2024-07-15 11:53:43.653853] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x114f420 00:12:30.179 [2024-07-15 11:53:43.653968] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1235150 00:12:30.179 [2024-07-15 11:53:43.653978] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1235150 00:12:30.179 [2024-07-15 11:53:43.654072] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:30.179 BaseBdev2 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:12:30.179 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.437 11:53:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:30.697 [ 00:12:30.697 { 00:12:30.697 "name": "BaseBdev2", 00:12:30.697 "aliases": [ 00:12:30.697 "64d389f3-9744-461d-9ba6-a72e6434b18e" 00:12:30.697 ], 00:12:30.697 "product_name": "Malloc disk", 00:12:30.697 "block_size": 512, 00:12:30.697 "num_blocks": 65536, 00:12:30.697 "uuid": "64d389f3-9744-461d-9ba6-a72e6434b18e", 00:12:30.697 "assigned_rate_limits": { 00:12:30.697 "rw_ios_per_sec": 0, 00:12:30.697 "rw_mbytes_per_sec": 0, 00:12:30.697 "r_mbytes_per_sec": 0, 00:12:30.697 "w_mbytes_per_sec": 0 00:12:30.697 }, 00:12:30.697 "claimed": true, 00:12:30.697 "claim_type": "exclusive_write", 00:12:30.697 "zoned": false, 00:12:30.697 "supported_io_types": { 00:12:30.697 "read": true, 00:12:30.697 "write": true, 00:12:30.697 "unmap": true, 00:12:30.697 "flush": true, 00:12:30.697 "reset": true, 00:12:30.697 "nvme_admin": false, 00:12:30.697 "nvme_io": false, 00:12:30.697 "nvme_io_md": false, 00:12:30.697 "write_zeroes": true, 
00:12:30.697 "zcopy": true, 00:12:30.697 "get_zone_info": false, 00:12:30.697 "zone_management": false, 00:12:30.697 "zone_append": false, 00:12:30.697 "compare": false, 00:12:30.697 "compare_and_write": false, 00:12:30.697 "abort": true, 00:12:30.697 "seek_hole": false, 00:12:30.697 "seek_data": false, 00:12:30.697 "copy": true, 00:12:30.697 "nvme_iov_md": false 00:12:30.697 }, 00:12:30.697 "memory_domains": [ 00:12:30.697 { 00:12:30.697 "dma_device_id": "system", 00:12:30.697 "dma_device_type": 1 00:12:30.697 }, 00:12:30.697 { 00:12:30.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:30.697 "dma_device_type": 2 00:12:30.697 } 00:12:30.697 ], 00:12:30.697 "driver_specific": {} 00:12:30.697 } 00:12:30.697 ] 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:30.697 "name": "Existed_Raid", 00:12:30.697 "uuid": "db03cfc3-3d27-425b-b4fe-ddf882bd2d89", 00:12:30.697 "strip_size_kb": 64, 00:12:30.697 "state": "online", 00:12:30.697 "raid_level": "raid0", 00:12:30.697 "superblock": true, 00:12:30.697 "num_base_bdevs": 2, 00:12:30.697 "num_base_bdevs_discovered": 2, 00:12:30.697 "num_base_bdevs_operational": 2, 00:12:30.697 "base_bdevs_list": [ 00:12:30.697 { 00:12:30.697 "name": "BaseBdev1", 00:12:30.697 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:30.697 "is_configured": true, 00:12:30.697 "data_offset": 2048, 00:12:30.697 "data_size": 63488 00:12:30.697 }, 00:12:30.697 { 00:12:30.697 "name": "BaseBdev2", 00:12:30.697 "uuid": "64d389f3-9744-461d-9ba6-a72e6434b18e", 00:12:30.697 "is_configured": true, 00:12:30.697 "data_offset": 2048, 00:12:30.697 "data_size": 63488 00:12:30.697 } 00:12:30.697 ] 00:12:30.697 }' 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:30.697 11:53:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=Existed_Raid 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:31.266 11:53:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:31.525 [2024-07-15 11:53:44.997345] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:31.525 "name": "Existed_Raid", 00:12:31.525 "aliases": [ 00:12:31.525 "db03cfc3-3d27-425b-b4fe-ddf882bd2d89" 00:12:31.525 ], 00:12:31.525 "product_name": "Raid Volume", 00:12:31.525 "block_size": 512, 00:12:31.525 "num_blocks": 126976, 00:12:31.525 "uuid": "db03cfc3-3d27-425b-b4fe-ddf882bd2d89", 00:12:31.525 "assigned_rate_limits": { 00:12:31.525 "rw_ios_per_sec": 0, 00:12:31.525 "rw_mbytes_per_sec": 0, 00:12:31.525 "r_mbytes_per_sec": 0, 00:12:31.525 "w_mbytes_per_sec": 0 00:12:31.525 }, 00:12:31.525 "claimed": false, 00:12:31.525 "zoned": false, 00:12:31.525 "supported_io_types": { 00:12:31.525 "read": true, 00:12:31.525 "write": true, 00:12:31.525 "unmap": true, 00:12:31.525 "flush": true, 00:12:31.525 "reset": true, 00:12:31.525 "nvme_admin": false, 00:12:31.525 "nvme_io": false, 00:12:31.525 "nvme_io_md": false, 00:12:31.525 "write_zeroes": true, 00:12:31.525 "zcopy": false, 00:12:31.525 "get_zone_info": false, 00:12:31.525 "zone_management": false, 00:12:31.525 
"zone_append": false, 00:12:31.525 "compare": false, 00:12:31.525 "compare_and_write": false, 00:12:31.525 "abort": false, 00:12:31.525 "seek_hole": false, 00:12:31.525 "seek_data": false, 00:12:31.525 "copy": false, 00:12:31.525 "nvme_iov_md": false 00:12:31.525 }, 00:12:31.525 "memory_domains": [ 00:12:31.525 { 00:12:31.525 "dma_device_id": "system", 00:12:31.525 "dma_device_type": 1 00:12:31.525 }, 00:12:31.525 { 00:12:31.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.525 "dma_device_type": 2 00:12:31.525 }, 00:12:31.525 { 00:12:31.525 "dma_device_id": "system", 00:12:31.525 "dma_device_type": 1 00:12:31.525 }, 00:12:31.525 { 00:12:31.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.525 "dma_device_type": 2 00:12:31.525 } 00:12:31.525 ], 00:12:31.525 "driver_specific": { 00:12:31.525 "raid": { 00:12:31.525 "uuid": "db03cfc3-3d27-425b-b4fe-ddf882bd2d89", 00:12:31.525 "strip_size_kb": 64, 00:12:31.525 "state": "online", 00:12:31.525 "raid_level": "raid0", 00:12:31.525 "superblock": true, 00:12:31.525 "num_base_bdevs": 2, 00:12:31.525 "num_base_bdevs_discovered": 2, 00:12:31.525 "num_base_bdevs_operational": 2, 00:12:31.525 "base_bdevs_list": [ 00:12:31.525 { 00:12:31.525 "name": "BaseBdev1", 00:12:31.525 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:31.525 "is_configured": true, 00:12:31.525 "data_offset": 2048, 00:12:31.525 "data_size": 63488 00:12:31.525 }, 00:12:31.525 { 00:12:31.525 "name": "BaseBdev2", 00:12:31.525 "uuid": "64d389f3-9744-461d-9ba6-a72e6434b18e", 00:12:31.525 "is_configured": true, 00:12:31.525 "data_offset": 2048, 00:12:31.525 "data_size": 63488 00:12:31.525 } 00:12:31.525 ] 00:12:31.525 } 00:12:31.525 } 00:12:31.525 }' 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:31.525 
BaseBdev2' 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:31.525 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:31.785 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:31.785 "name": "BaseBdev1", 00:12:31.785 "aliases": [ 00:12:31.785 "eaee1ddf-b49d-4f2a-b034-df79d0658b08" 00:12:31.785 ], 00:12:31.785 "product_name": "Malloc disk", 00:12:31.785 "block_size": 512, 00:12:31.785 "num_blocks": 65536, 00:12:31.785 "uuid": "eaee1ddf-b49d-4f2a-b034-df79d0658b08", 00:12:31.785 "assigned_rate_limits": { 00:12:31.785 "rw_ios_per_sec": 0, 00:12:31.785 "rw_mbytes_per_sec": 0, 00:12:31.785 "r_mbytes_per_sec": 0, 00:12:31.785 "w_mbytes_per_sec": 0 00:12:31.785 }, 00:12:31.785 "claimed": true, 00:12:31.785 "claim_type": "exclusive_write", 00:12:31.785 "zoned": false, 00:12:31.785 "supported_io_types": { 00:12:31.785 "read": true, 00:12:31.785 "write": true, 00:12:31.785 "unmap": true, 00:12:31.785 "flush": true, 00:12:31.785 "reset": true, 00:12:31.785 "nvme_admin": false, 00:12:31.785 "nvme_io": false, 00:12:31.785 "nvme_io_md": false, 00:12:31.785 "write_zeroes": true, 00:12:31.785 "zcopy": true, 00:12:31.785 "get_zone_info": false, 00:12:31.785 "zone_management": false, 00:12:31.785 "zone_append": false, 00:12:31.785 "compare": false, 00:12:31.785 "compare_and_write": false, 00:12:31.785 "abort": true, 00:12:31.785 "seek_hole": false, 00:12:31.785 "seek_data": false, 00:12:31.785 "copy": true, 00:12:31.785 "nvme_iov_md": false 00:12:31.785 }, 00:12:31.785 "memory_domains": [ 00:12:31.785 { 00:12:31.785 "dma_device_id": "system", 00:12:31.785 "dma_device_type": 1 00:12:31.785 }, 00:12:31.785 { 
00:12:31.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.785 "dma_device_type": 2 00:12:31.785 } 00:12:31.785 ], 00:12:31.785 "driver_specific": {} 00:12:31.785 }' 00:12:31.785 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:31.785 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.044 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.303 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.303 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.303 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:32.303 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.563 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.563 "name": 
"BaseBdev2", 00:12:32.563 "aliases": [ 00:12:32.563 "64d389f3-9744-461d-9ba6-a72e6434b18e" 00:12:32.563 ], 00:12:32.563 "product_name": "Malloc disk", 00:12:32.563 "block_size": 512, 00:12:32.563 "num_blocks": 65536, 00:12:32.563 "uuid": "64d389f3-9744-461d-9ba6-a72e6434b18e", 00:12:32.563 "assigned_rate_limits": { 00:12:32.563 "rw_ios_per_sec": 0, 00:12:32.563 "rw_mbytes_per_sec": 0, 00:12:32.563 "r_mbytes_per_sec": 0, 00:12:32.563 "w_mbytes_per_sec": 0 00:12:32.563 }, 00:12:32.563 "claimed": true, 00:12:32.563 "claim_type": "exclusive_write", 00:12:32.563 "zoned": false, 00:12:32.563 "supported_io_types": { 00:12:32.563 "read": true, 00:12:32.563 "write": true, 00:12:32.563 "unmap": true, 00:12:32.563 "flush": true, 00:12:32.563 "reset": true, 00:12:32.563 "nvme_admin": false, 00:12:32.563 "nvme_io": false, 00:12:32.563 "nvme_io_md": false, 00:12:32.563 "write_zeroes": true, 00:12:32.563 "zcopy": true, 00:12:32.563 "get_zone_info": false, 00:12:32.563 "zone_management": false, 00:12:32.563 "zone_append": false, 00:12:32.563 "compare": false, 00:12:32.563 "compare_and_write": false, 00:12:32.563 "abort": true, 00:12:32.563 "seek_hole": false, 00:12:32.563 "seek_data": false, 00:12:32.563 "copy": true, 00:12:32.563 "nvme_iov_md": false 00:12:32.563 }, 00:12:32.563 "memory_domains": [ 00:12:32.563 { 00:12:32.563 "dma_device_id": "system", 00:12:32.563 "dma_device_type": 1 00:12:32.563 }, 00:12:32.563 { 00:12:32.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.563 "dma_device_type": 2 00:12:32.563 } 00:12:32.563 ], 00:12:32.563 "driver_specific": {} 00:12:32.563 }' 00:12:32.563 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.563 11:53:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.563 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.822 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:32.822 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.822 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:32.822 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:32.822 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:33.082 [2024-07-15 11:53:46.493105] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:33.082 [2024-07-15 11:53:46.493132] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:33.082 [2024-07-15 11:53:46.493174] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:33.082 11:53:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.082 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:33.341 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:33.341 "name": "Existed_Raid", 00:12:33.341 "uuid": "db03cfc3-3d27-425b-b4fe-ddf882bd2d89", 00:12:33.341 "strip_size_kb": 64, 00:12:33.341 "state": "offline", 00:12:33.341 "raid_level": "raid0", 00:12:33.341 "superblock": true, 00:12:33.341 "num_base_bdevs": 2, 00:12:33.341 "num_base_bdevs_discovered": 1, 00:12:33.341 "num_base_bdevs_operational": 1, 00:12:33.341 "base_bdevs_list": [ 
00:12:33.341 { 00:12:33.341 "name": null, 00:12:33.341 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:33.341 "is_configured": false, 00:12:33.341 "data_offset": 2048, 00:12:33.341 "data_size": 63488 00:12:33.341 }, 00:12:33.341 { 00:12:33.341 "name": "BaseBdev2", 00:12:33.341 "uuid": "64d389f3-9744-461d-9ba6-a72e6434b18e", 00:12:33.341 "is_configured": true, 00:12:33.341 "data_offset": 2048, 00:12:33.341 "data_size": 63488 00:12:33.341 } 00:12:33.341 ] 00:12:33.341 }' 00:12:33.341 11:53:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:33.341 11:53:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:33.910 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:33.910 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:33.910 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:33.910 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:34.169 [2024-07-15 11:53:47.694166] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:34.169 [2024-07-15 11:53:47.694216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1235150 name Existed_Raid, state offline 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:34.169 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1455825 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1455825 ']' 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1455825 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1455825 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1455825' 00:12:34.429 killing process with pid 1455825 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@967 -- # kill 1455825 00:12:34.429 [2024-07-15 11:53:47.954827] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:34.429 11:53:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1455825 00:12:34.429 [2024-07-15 11:53:47.955706] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:34.688 11:53:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:34.688 00:12:34.688 real 0m10.256s 00:12:34.688 user 0m18.243s 00:12:34.688 sys 0m1.860s 00:12:34.688 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:34.688 11:53:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:34.688 ************************************ 00:12:34.688 END TEST raid_state_function_test_sb 00:12:34.688 ************************************ 00:12:34.688 11:53:48 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:34.688 11:53:48 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:12:34.688 11:53:48 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:12:34.688 11:53:48 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:34.688 11:53:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:34.688 ************************************ 00:12:34.688 START TEST raid_superblock_test 00:12:34.688 ************************************ 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 2 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:34.688 11:53:48 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1457360 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1457360 /var/tmp/spdk-raid.sock 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1457360 ']' 00:12:34.688 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local 
rpc_addr=/var/tmp/spdk-raid.sock 00:12:34.689 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:34.689 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:34.689 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:34.689 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:34.689 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.948 [2024-07-15 11:53:48.320647] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:34.948 [2024-07-15 11:53:48.320723] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1457360 ] 00:12:34.948 [2024-07-15 11:53:48.447825] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:35.207 [2024-07-15 11:53:48.553313] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:35.207 [2024-07-15 11:53:48.610213] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.207 [2024-07-15 11:53:48.610241] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local 
bdev_malloc=malloc1 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:35.207 11:53:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:35.465 malloc1 00:12:35.465 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:36.034 [2024-07-15 11:53:49.520402] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:36.034 [2024-07-15 11:53:49.520454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:36.034 [2024-07-15 11:53:49.520476] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x111a560 00:12:36.034 [2024-07-15 11:53:49.520488] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:36.034 [2024-07-15 11:53:49.522163] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:36.034 [2024-07-15 11:53:49.522192] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:36.034 pt1 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # 
(( i <= num_base_bdevs )) 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:36.034 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:36.293 malloc2 00:12:36.293 11:53:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:36.862 [2024-07-15 11:53:50.292312] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:36.862 [2024-07-15 11:53:50.292359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:36.862 [2024-07-15 11:53:50.292377] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11b85b0 00:12:36.862 [2024-07-15 11:53:50.292389] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:36.862 [2024-07-15 11:53:50.293955] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:36.862 [2024-07-15 11:53:50.293984] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:36.862 pt2 00:12:36.862 11:53:50 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:36.862 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:36.862 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:37.430 [2024-07-15 11:53:50.805669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:37.430 [2024-07-15 11:53:50.807003] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:37.430 [2024-07-15 11:53:50.807144] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11b9db0 00:12:37.430 [2024-07-15 11:53:50.807157] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:37.430 [2024-07-15 11:53:50.807349] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11bace0 00:12:37.430 [2024-07-15 11:53:50.807485] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11b9db0 00:12:37.430 [2024-07-15 11:53:50.807495] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11b9db0 00:12:37.430 [2024-07-15 11:53:50.807594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:37.430 11:53:50 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.431 11:53:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:37.689 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:37.690 "name": "raid_bdev1", 00:12:37.690 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:37.690 "strip_size_kb": 64, 00:12:37.690 "state": "online", 00:12:37.690 "raid_level": "raid0", 00:12:37.690 "superblock": true, 00:12:37.690 "num_base_bdevs": 2, 00:12:37.690 "num_base_bdevs_discovered": 2, 00:12:37.690 "num_base_bdevs_operational": 2, 00:12:37.690 "base_bdevs_list": [ 00:12:37.690 { 00:12:37.690 "name": "pt1", 00:12:37.690 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:37.690 "is_configured": true, 00:12:37.690 "data_offset": 2048, 00:12:37.690 "data_size": 63488 00:12:37.690 }, 00:12:37.690 { 00:12:37.690 "name": "pt2", 00:12:37.690 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:37.690 "is_configured": true, 00:12:37.690 "data_offset": 2048, 00:12:37.690 "data_size": 63488 00:12:37.690 } 00:12:37.690 ] 00:12:37.690 }' 00:12:37.690 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:37.690 11:53:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.257 11:53:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:38.257 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:38.516 [2024-07-15 11:53:51.916857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:38.516 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:38.516 "name": "raid_bdev1", 00:12:38.516 "aliases": [ 00:12:38.516 "431f8a3f-479c-4fe9-9392-5672390abbbd" 00:12:38.516 ], 00:12:38.516 "product_name": "Raid Volume", 00:12:38.516 "block_size": 512, 00:12:38.516 "num_blocks": 126976, 00:12:38.516 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:38.516 "assigned_rate_limits": { 00:12:38.516 "rw_ios_per_sec": 0, 00:12:38.516 "rw_mbytes_per_sec": 0, 00:12:38.516 "r_mbytes_per_sec": 0, 00:12:38.516 "w_mbytes_per_sec": 0 00:12:38.516 }, 00:12:38.516 "claimed": false, 00:12:38.516 "zoned": false, 00:12:38.516 "supported_io_types": { 00:12:38.516 "read": true, 00:12:38.516 "write": true, 00:12:38.516 "unmap": true, 00:12:38.516 "flush": true, 00:12:38.516 "reset": true, 00:12:38.516 "nvme_admin": false, 00:12:38.516 "nvme_io": false, 00:12:38.516 "nvme_io_md": false, 00:12:38.516 "write_zeroes": 
true, 00:12:38.516 "zcopy": false, 00:12:38.516 "get_zone_info": false, 00:12:38.516 "zone_management": false, 00:12:38.516 "zone_append": false, 00:12:38.516 "compare": false, 00:12:38.516 "compare_and_write": false, 00:12:38.516 "abort": false, 00:12:38.516 "seek_hole": false, 00:12:38.516 "seek_data": false, 00:12:38.516 "copy": false, 00:12:38.516 "nvme_iov_md": false 00:12:38.516 }, 00:12:38.516 "memory_domains": [ 00:12:38.516 { 00:12:38.516 "dma_device_id": "system", 00:12:38.516 "dma_device_type": 1 00:12:38.516 }, 00:12:38.516 { 00:12:38.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.516 "dma_device_type": 2 00:12:38.516 }, 00:12:38.516 { 00:12:38.516 "dma_device_id": "system", 00:12:38.516 "dma_device_type": 1 00:12:38.516 }, 00:12:38.516 { 00:12:38.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.516 "dma_device_type": 2 00:12:38.516 } 00:12:38.516 ], 00:12:38.516 "driver_specific": { 00:12:38.516 "raid": { 00:12:38.516 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:38.516 "strip_size_kb": 64, 00:12:38.516 "state": "online", 00:12:38.516 "raid_level": "raid0", 00:12:38.516 "superblock": true, 00:12:38.516 "num_base_bdevs": 2, 00:12:38.516 "num_base_bdevs_discovered": 2, 00:12:38.516 "num_base_bdevs_operational": 2, 00:12:38.516 "base_bdevs_list": [ 00:12:38.516 { 00:12:38.516 "name": "pt1", 00:12:38.516 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:38.516 "is_configured": true, 00:12:38.516 "data_offset": 2048, 00:12:38.516 "data_size": 63488 00:12:38.516 }, 00:12:38.516 { 00:12:38.516 "name": "pt2", 00:12:38.516 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:38.516 "is_configured": true, 00:12:38.516 "data_offset": 2048, 00:12:38.516 "data_size": 63488 00:12:38.516 } 00:12:38.516 ] 00:12:38.516 } 00:12:38.516 } 00:12:38.516 }' 00:12:38.516 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:38.516 11:53:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:38.516 pt2' 00:12:38.516 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:38.516 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:38.516 11:53:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:38.775 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:38.776 "name": "pt1", 00:12:38.776 "aliases": [ 00:12:38.776 "00000000-0000-0000-0000-000000000001" 00:12:38.776 ], 00:12:38.776 "product_name": "passthru", 00:12:38.776 "block_size": 512, 00:12:38.776 "num_blocks": 65536, 00:12:38.776 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:38.776 "assigned_rate_limits": { 00:12:38.776 "rw_ios_per_sec": 0, 00:12:38.776 "rw_mbytes_per_sec": 0, 00:12:38.776 "r_mbytes_per_sec": 0, 00:12:38.776 "w_mbytes_per_sec": 0 00:12:38.776 }, 00:12:38.776 "claimed": true, 00:12:38.776 "claim_type": "exclusive_write", 00:12:38.776 "zoned": false, 00:12:38.776 "supported_io_types": { 00:12:38.776 "read": true, 00:12:38.776 "write": true, 00:12:38.776 "unmap": true, 00:12:38.776 "flush": true, 00:12:38.776 "reset": true, 00:12:38.776 "nvme_admin": false, 00:12:38.776 "nvme_io": false, 00:12:38.776 "nvme_io_md": false, 00:12:38.776 "write_zeroes": true, 00:12:38.776 "zcopy": true, 00:12:38.776 "get_zone_info": false, 00:12:38.776 "zone_management": false, 00:12:38.776 "zone_append": false, 00:12:38.776 "compare": false, 00:12:38.776 "compare_and_write": false, 00:12:38.776 "abort": true, 00:12:38.776 "seek_hole": false, 00:12:38.776 "seek_data": false, 00:12:38.776 "copy": true, 00:12:38.776 "nvme_iov_md": false 00:12:38.776 }, 00:12:38.776 "memory_domains": [ 00:12:38.776 { 00:12:38.776 "dma_device_id": "system", 00:12:38.776 
"dma_device_type": 1 00:12:38.776 }, 00:12:38.776 { 00:12:38.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.776 "dma_device_type": 2 00:12:38.776 } 00:12:38.776 ], 00:12:38.776 "driver_specific": { 00:12:38.776 "passthru": { 00:12:38.776 "name": "pt1", 00:12:38.776 "base_bdev_name": "malloc1" 00:12:38.776 } 00:12:38.776 } 00:12:38.776 }' 00:12:38.776 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.776 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:38.776 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:38.776 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:38.776 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:39.035 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:39.294 11:53:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:39.294 "name": "pt2", 00:12:39.294 "aliases": [ 00:12:39.294 "00000000-0000-0000-0000-000000000002" 00:12:39.294 ], 00:12:39.294 "product_name": "passthru", 00:12:39.294 "block_size": 512, 00:12:39.294 "num_blocks": 65536, 00:12:39.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:39.294 "assigned_rate_limits": { 00:12:39.294 "rw_ios_per_sec": 0, 00:12:39.294 "rw_mbytes_per_sec": 0, 00:12:39.294 "r_mbytes_per_sec": 0, 00:12:39.294 "w_mbytes_per_sec": 0 00:12:39.294 }, 00:12:39.294 "claimed": true, 00:12:39.294 "claim_type": "exclusive_write", 00:12:39.294 "zoned": false, 00:12:39.294 "supported_io_types": { 00:12:39.294 "read": true, 00:12:39.294 "write": true, 00:12:39.294 "unmap": true, 00:12:39.294 "flush": true, 00:12:39.294 "reset": true, 00:12:39.294 "nvme_admin": false, 00:12:39.294 "nvme_io": false, 00:12:39.294 "nvme_io_md": false, 00:12:39.294 "write_zeroes": true, 00:12:39.294 "zcopy": true, 00:12:39.294 "get_zone_info": false, 00:12:39.294 "zone_management": false, 00:12:39.294 "zone_append": false, 00:12:39.294 "compare": false, 00:12:39.294 "compare_and_write": false, 00:12:39.294 "abort": true, 00:12:39.294 "seek_hole": false, 00:12:39.294 "seek_data": false, 00:12:39.294 "copy": true, 00:12:39.294 "nvme_iov_md": false 00:12:39.294 }, 00:12:39.294 "memory_domains": [ 00:12:39.294 { 00:12:39.294 "dma_device_id": "system", 00:12:39.294 "dma_device_type": 1 00:12:39.294 }, 00:12:39.294 { 00:12:39.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:39.294 "dma_device_type": 2 00:12:39.294 } 00:12:39.294 ], 00:12:39.294 "driver_specific": { 00:12:39.294 "passthru": { 00:12:39.294 "name": "pt2", 00:12:39.294 "base_bdev_name": "malloc2" 00:12:39.294 } 00:12:39.294 } 00:12:39.294 }' 00:12:39.294 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.294 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:39.553 11:53:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:39.553 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.553 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:39.553 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:39.553 11:53:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.553 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:39.553 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:39.553 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.553 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:39.813 [2024-07-15 11:53:53.380708] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=431f8a3f-479c-4fe9-9392-5672390abbbd 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 431f8a3f-479c-4fe9-9392-5672390abbbd ']' 00:12:39.813 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:40.380 [2024-07-15 11:53:53.881803] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:40.380 
[2024-07-15 11:53:53.881832] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:40.380 [2024-07-15 11:53:53.881888] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:40.380 [2024-07-15 11:53:53.881933] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:40.380 [2024-07-15 11:53:53.881946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11b9db0 name raid_bdev1, state offline 00:12:40.380 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.380 11:53:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:40.639 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:40.639 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:40.639 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:40.639 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:41.206 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:41.206 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:41.466 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:41.466 11:53:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:41.725 11:53:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:41.725 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:12:41.984 [2024-07-15 11:53:55.397775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:41.984 [2024-07-15 11:53:55.399162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:41.984 [2024-07-15 11:53:55.399218] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:41.984 [2024-07-15 11:53:55.399257] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:41.984 [2024-07-15 11:53:55.399276] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:41.984 [2024-07-15 11:53:55.399285] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x111dab0 name raid_bdev1, state configuring 00:12:41.984 request: 00:12:41.984 { 00:12:41.984 "name": "raid_bdev1", 00:12:41.984 "raid_level": "raid0", 00:12:41.984 "base_bdevs": [ 00:12:41.984 "malloc1", 00:12:41.984 "malloc2" 00:12:41.984 ], 00:12:41.984 "strip_size_kb": 64, 00:12:41.984 "superblock": false, 00:12:41.985 "method": "bdev_raid_create", 00:12:41.985 "req_id": 1 00:12:41.985 } 00:12:41.985 Got JSON-RPC error response 00:12:41.985 response: 00:12:41.985 { 00:12:41.985 "code": -17, 00:12:41.985 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:41.985 } 00:12:41.985 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:12:41.985 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:12:41.985 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:12:41.985 11:53:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:12:41.985 11:53:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.985 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:12:42.243 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:12:42.243 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:12:42.243 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:42.502 [2024-07-15 11:53:55.887000] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:42.502 [2024-07-15 11:53:55.887044] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:42.502 [2024-07-15 11:53:55.887061] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11baae0 00:12:42.503 [2024-07-15 11:53:55.887073] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:42.503 [2024-07-15 11:53:55.888703] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:42.503 [2024-07-15 11:53:55.888732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:42.503 [2024-07-15 11:53:55.888798] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:42.503 [2024-07-15 11:53:55.888822] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:42.503 pt1 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:42.503 11:53:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.503 11:53:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:42.761 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.761 "name": "raid_bdev1", 00:12:42.761 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:42.761 "strip_size_kb": 64, 00:12:42.761 "state": "configuring", 00:12:42.761 "raid_level": "raid0", 00:12:42.761 "superblock": true, 00:12:42.761 "num_base_bdevs": 2, 00:12:42.761 "num_base_bdevs_discovered": 1, 00:12:42.761 "num_base_bdevs_operational": 2, 00:12:42.761 "base_bdevs_list": [ 00:12:42.761 { 00:12:42.761 "name": "pt1", 00:12:42.761 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:42.761 "is_configured": true, 00:12:42.761 "data_offset": 2048, 00:12:42.761 "data_size": 63488 00:12:42.761 }, 00:12:42.761 { 00:12:42.761 "name": null, 00:12:42.761 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:42.761 
"is_configured": false, 00:12:42.761 "data_offset": 2048, 00:12:42.761 "data_size": 63488 00:12:42.761 } 00:12:42.761 ] 00:12:42.761 }' 00:12:42.761 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.761 11:53:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.329 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:12:43.329 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:12:43.329 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:43.329 11:53:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:43.588 [2024-07-15 11:53:56.981926] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:43.588 [2024-07-15 11:53:56.981974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:43.588 [2024-07-15 11:53:56.981991] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11bb030 00:12:43.588 [2024-07-15 11:53:56.982003] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:43.588 [2024-07-15 11:53:56.982330] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:43.588 [2024-07-15 11:53:56.982348] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:43.588 [2024-07-15 11:53:56.982408] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:43.588 [2024-07-15 11:53:56.982426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:43.588 [2024-07-15 11:53:56.982519] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11199d0 00:12:43.588 [2024-07-15 
11:53:56.982530] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:43.588 [2024-07-15 11:53:56.982707] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x111cc90 00:12:43.588 [2024-07-15 11:53:56.982830] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11199d0 00:12:43.588 [2024-07-15 11:53:56.982840] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11199d0 00:12:43.588 [2024-07-15 11:53:56.982940] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:43.588 pt2 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.588 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:43.847 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.847 "name": "raid_bdev1", 00:12:43.847 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:43.847 "strip_size_kb": 64, 00:12:43.847 "state": "online", 00:12:43.847 "raid_level": "raid0", 00:12:43.847 "superblock": true, 00:12:43.847 "num_base_bdevs": 2, 00:12:43.847 "num_base_bdevs_discovered": 2, 00:12:43.847 "num_base_bdevs_operational": 2, 00:12:43.847 "base_bdevs_list": [ 00:12:43.847 { 00:12:43.847 "name": "pt1", 00:12:43.847 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:43.847 "is_configured": true, 00:12:43.847 "data_offset": 2048, 00:12:43.847 "data_size": 63488 00:12:43.847 }, 00:12:43.847 { 00:12:43.847 "name": "pt2", 00:12:43.847 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:43.847 "is_configured": true, 00:12:43.847 "data_offset": 2048, 00:12:43.847 "data_size": 63488 00:12:43.847 } 00:12:43.847 ] 00:12:43.847 }' 00:12:43.847 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.847 11:53:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:44.416 11:53:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:44.416 [2024-07-15 11:53:57.956859] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:44.416 "name": "raid_bdev1", 00:12:44.416 "aliases": [ 00:12:44.416 "431f8a3f-479c-4fe9-9392-5672390abbbd" 00:12:44.416 ], 00:12:44.416 "product_name": "Raid Volume", 00:12:44.416 "block_size": 512, 00:12:44.416 "num_blocks": 126976, 00:12:44.416 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:44.416 "assigned_rate_limits": { 00:12:44.416 "rw_ios_per_sec": 0, 00:12:44.416 "rw_mbytes_per_sec": 0, 00:12:44.416 "r_mbytes_per_sec": 0, 00:12:44.416 "w_mbytes_per_sec": 0 00:12:44.416 }, 00:12:44.416 "claimed": false, 00:12:44.416 "zoned": false, 00:12:44.416 "supported_io_types": { 00:12:44.416 "read": true, 00:12:44.416 "write": true, 00:12:44.416 "unmap": true, 00:12:44.416 "flush": true, 00:12:44.416 "reset": true, 00:12:44.416 "nvme_admin": false, 00:12:44.416 "nvme_io": false, 00:12:44.416 "nvme_io_md": false, 00:12:44.416 "write_zeroes": true, 00:12:44.416 "zcopy": false, 00:12:44.416 "get_zone_info": false, 00:12:44.416 "zone_management": false, 00:12:44.416 "zone_append": false, 00:12:44.416 "compare": false, 00:12:44.416 "compare_and_write": false, 00:12:44.416 "abort": false, 00:12:44.416 "seek_hole": false, 00:12:44.416 "seek_data": false, 00:12:44.416 "copy": false, 00:12:44.416 "nvme_iov_md": false 00:12:44.416 }, 00:12:44.416 "memory_domains": [ 00:12:44.416 { 00:12:44.416 "dma_device_id": "system", 00:12:44.416 "dma_device_type": 1 00:12:44.416 }, 00:12:44.416 { 
00:12:44.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.416 "dma_device_type": 2 00:12:44.416 }, 00:12:44.416 { 00:12:44.416 "dma_device_id": "system", 00:12:44.416 "dma_device_type": 1 00:12:44.416 }, 00:12:44.416 { 00:12:44.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.416 "dma_device_type": 2 00:12:44.416 } 00:12:44.416 ], 00:12:44.416 "driver_specific": { 00:12:44.416 "raid": { 00:12:44.416 "uuid": "431f8a3f-479c-4fe9-9392-5672390abbbd", 00:12:44.416 "strip_size_kb": 64, 00:12:44.416 "state": "online", 00:12:44.416 "raid_level": "raid0", 00:12:44.416 "superblock": true, 00:12:44.416 "num_base_bdevs": 2, 00:12:44.416 "num_base_bdevs_discovered": 2, 00:12:44.416 "num_base_bdevs_operational": 2, 00:12:44.416 "base_bdevs_list": [ 00:12:44.416 { 00:12:44.416 "name": "pt1", 00:12:44.416 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:44.416 "is_configured": true, 00:12:44.416 "data_offset": 2048, 00:12:44.416 "data_size": 63488 00:12:44.416 }, 00:12:44.416 { 00:12:44.416 "name": "pt2", 00:12:44.416 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:44.416 "is_configured": true, 00:12:44.416 "data_offset": 2048, 00:12:44.416 "data_size": 63488 00:12:44.416 } 00:12:44.416 ] 00:12:44.416 } 00:12:44.416 } 00:12:44.416 }' 00:12:44.416 11:53:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:44.676 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:44.676 pt2' 00:12:44.676 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:44.676 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:44.676 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:44.935 11:53:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:44.935 "name": "pt1", 00:12:44.935 "aliases": [ 00:12:44.935 "00000000-0000-0000-0000-000000000001" 00:12:44.935 ], 00:12:44.935 "product_name": "passthru", 00:12:44.935 "block_size": 512, 00:12:44.935 "num_blocks": 65536, 00:12:44.935 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:44.935 "assigned_rate_limits": { 00:12:44.935 "rw_ios_per_sec": 0, 00:12:44.935 "rw_mbytes_per_sec": 0, 00:12:44.935 "r_mbytes_per_sec": 0, 00:12:44.935 "w_mbytes_per_sec": 0 00:12:44.935 }, 00:12:44.935 "claimed": true, 00:12:44.935 "claim_type": "exclusive_write", 00:12:44.935 "zoned": false, 00:12:44.935 "supported_io_types": { 00:12:44.935 "read": true, 00:12:44.935 "write": true, 00:12:44.935 "unmap": true, 00:12:44.935 "flush": true, 00:12:44.935 "reset": true, 00:12:44.935 "nvme_admin": false, 00:12:44.935 "nvme_io": false, 00:12:44.935 "nvme_io_md": false, 00:12:44.935 "write_zeroes": true, 00:12:44.935 "zcopy": true, 00:12:44.935 "get_zone_info": false, 00:12:44.935 "zone_management": false, 00:12:44.935 "zone_append": false, 00:12:44.935 "compare": false, 00:12:44.935 "compare_and_write": false, 00:12:44.935 "abort": true, 00:12:44.935 "seek_hole": false, 00:12:44.935 "seek_data": false, 00:12:44.935 "copy": true, 00:12:44.935 "nvme_iov_md": false 00:12:44.935 }, 00:12:44.935 "memory_domains": [ 00:12:44.935 { 00:12:44.935 "dma_device_id": "system", 00:12:44.935 "dma_device_type": 1 00:12:44.935 }, 00:12:44.935 { 00:12:44.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:44.935 "dma_device_type": 2 00:12:44.935 } 00:12:44.935 ], 00:12:44.935 "driver_specific": { 00:12:44.935 "passthru": { 00:12:44.935 "name": "pt1", 00:12:44.935 "base_bdev_name": "malloc1" 00:12:44.935 } 00:12:44.935 } 00:12:44.935 }' 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:44.935 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:45.194 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:45.454 "name": "pt2", 00:12:45.454 "aliases": [ 00:12:45.454 "00000000-0000-0000-0000-000000000002" 00:12:45.454 ], 00:12:45.454 "product_name": "passthru", 00:12:45.454 "block_size": 512, 00:12:45.454 "num_blocks": 65536, 00:12:45.454 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:45.454 "assigned_rate_limits": { 00:12:45.454 "rw_ios_per_sec": 0, 00:12:45.454 "rw_mbytes_per_sec": 0, 00:12:45.454 "r_mbytes_per_sec": 0, 00:12:45.454 "w_mbytes_per_sec": 0 00:12:45.454 }, 
00:12:45.454 "claimed": true, 00:12:45.454 "claim_type": "exclusive_write", 00:12:45.454 "zoned": false, 00:12:45.454 "supported_io_types": { 00:12:45.454 "read": true, 00:12:45.454 "write": true, 00:12:45.454 "unmap": true, 00:12:45.454 "flush": true, 00:12:45.454 "reset": true, 00:12:45.454 "nvme_admin": false, 00:12:45.454 "nvme_io": false, 00:12:45.454 "nvme_io_md": false, 00:12:45.454 "write_zeroes": true, 00:12:45.454 "zcopy": true, 00:12:45.454 "get_zone_info": false, 00:12:45.454 "zone_management": false, 00:12:45.454 "zone_append": false, 00:12:45.454 "compare": false, 00:12:45.454 "compare_and_write": false, 00:12:45.454 "abort": true, 00:12:45.454 "seek_hole": false, 00:12:45.454 "seek_data": false, 00:12:45.454 "copy": true, 00:12:45.454 "nvme_iov_md": false 00:12:45.454 }, 00:12:45.454 "memory_domains": [ 00:12:45.454 { 00:12:45.454 "dma_device_id": "system", 00:12:45.454 "dma_device_type": 1 00:12:45.454 }, 00:12:45.454 { 00:12:45.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:45.454 "dma_device_type": 2 00:12:45.454 } 00:12:45.454 ], 00:12:45.454 "driver_specific": { 00:12:45.454 "passthru": { 00:12:45.454 "name": "pt2", 00:12:45.454 "base_bdev_name": "malloc2" 00:12:45.454 } 00:12:45.454 } 00:12:45.454 }' 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.454 11:53:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:45.454 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:45.454 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:45.712 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:12:45.970 [2024-07-15 11:53:59.444802] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 431f8a3f-479c-4fe9-9392-5672390abbbd '!=' 431f8a3f-479c-4fe9-9392-5672390abbbd ']' 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1457360 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1457360 ']' 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1457360 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1457360 00:12:45.970 
11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1457360' 00:12:45.970 killing process with pid 1457360 00:12:45.970 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1457360 00:12:45.971 [2024-07-15 11:53:59.520619] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:45.971 [2024-07-15 11:53:59.520676] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:45.971 [2024-07-15 11:53:59.520725] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:45.971 [2024-07-15 11:53:59.520744] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11199d0 name raid_bdev1, state offline 00:12:45.971 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1457360 00:12:45.971 [2024-07-15 11:53:59.537032] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:46.229 11:53:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:12:46.229 00:12:46.229 real 0m11.485s 00:12:46.229 user 0m20.980s 00:12:46.229 sys 0m2.163s 00:12:46.229 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:46.229 11:53:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.229 ************************************ 00:12:46.229 END TEST raid_superblock_test 00:12:46.229 ************************************ 00:12:46.229 11:53:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:46.229 11:53:59 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:12:46.229 11:53:59 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:46.229 11:53:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:46.229 11:53:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:46.487 ************************************ 00:12:46.488 START TEST raid_read_error_test 00:12:46.488 ************************************ 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 read 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@793 -- # local strip_size 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Q8LfnT52Dt 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1459086 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1459086 /var/tmp/spdk-raid.sock 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1459086 ']' 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:12:46.488 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:46.488 11:53:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:46.488 [2024-07-15 11:53:59.900810] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:46.488 [2024-07-15 11:53:59.900880] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1459086 ] 00:12:46.488 [2024-07-15 11:54:00.034032] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:46.746 [2024-07-15 11:54:00.139721] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:46.746 [2024-07-15 11:54:00.202971] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:46.746 [2024-07-15 11:54:00.203002] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:47.314 11:54:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:47.314 11:54:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:47.314 11:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:47.314 11:54:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:47.572 BaseBdev1_malloc 00:12:47.572 11:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:47.830 true 00:12:47.830 11:54:01 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:48.089 [2024-07-15 11:54:01.512747] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:48.089 [2024-07-15 11:54:01.512795] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.089 [2024-07-15 11:54:01.512816] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18364e0 00:12:48.089 [2024-07-15 11:54:01.512829] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.089 [2024-07-15 11:54:01.514538] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.089 [2024-07-15 11:54:01.514569] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:48.089 BaseBdev1 00:12:48.089 11:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:48.089 11:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:48.348 BaseBdev2_malloc 00:12:48.348 11:54:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:48.607 true 00:12:48.607 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:48.865 [2024-07-15 11:54:02.248103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:48.865 [2024-07-15 11:54:02.248152] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:48.865 [2024-07-15 11:54:02.248173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x183b7b0 00:12:48.865 [2024-07-15 11:54:02.248191] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:48.865 [2024-07-15 11:54:02.249839] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:48.865 [2024-07-15 11:54:02.249871] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:48.865 BaseBdev2 00:12:48.865 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:49.173 [2024-07-15 11:54:02.496794] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:49.173 [2024-07-15 11:54:02.498210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:49.173 [2024-07-15 11:54:02.498396] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x183ce10 00:12:49.173 [2024-07-15 11:54:02.498409] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:49.173 [2024-07-15 11:54:02.498608] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16912d0 00:12:49.173 [2024-07-15 11:54:02.498770] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x183ce10 00:12:49.173 [2024-07-15 11:54:02.498780] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x183ce10 00:12:49.173 [2024-07-15 11:54:02.498885] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:49.173 11:54:02 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.173 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:49.447 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.447 "name": "raid_bdev1", 00:12:49.447 "uuid": "97d3618e-c74d-4e4c-8a77-597e3631d977", 00:12:49.447 "strip_size_kb": 64, 00:12:49.447 "state": "online", 00:12:49.447 "raid_level": "raid0", 00:12:49.447 "superblock": true, 00:12:49.447 "num_base_bdevs": 2, 00:12:49.447 "num_base_bdevs_discovered": 2, 00:12:49.447 "num_base_bdevs_operational": 2, 00:12:49.447 "base_bdevs_list": [ 00:12:49.447 { 00:12:49.447 "name": "BaseBdev1", 00:12:49.447 "uuid": "bb7cf090-117f-5897-8f9f-b5171450b464", 00:12:49.447 "is_configured": true, 00:12:49.447 "data_offset": 2048, 00:12:49.447 "data_size": 63488 00:12:49.447 }, 
00:12:49.447 { 00:12:49.447 "name": "BaseBdev2", 00:12:49.447 "uuid": "074a2d48-268e-5980-89f4-5d0ae7b8afd7", 00:12:49.447 "is_configured": true, 00:12:49.447 "data_offset": 2048, 00:12:49.447 "data_size": 63488 00:12:49.447 } 00:12:49.447 ] 00:12:49.447 }' 00:12:49.447 11:54:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.447 11:54:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:50.015 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:50.015 11:54:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:50.015 [2024-07-15 11:54:03.467641] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1837f70 00:12:50.949 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:51.207 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:51.465 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:51.465 "name": "raid_bdev1", 00:12:51.465 "uuid": "97d3618e-c74d-4e4c-8a77-597e3631d977", 00:12:51.465 "strip_size_kb": 64, 00:12:51.465 "state": "online", 00:12:51.465 "raid_level": "raid0", 00:12:51.465 "superblock": true, 00:12:51.465 "num_base_bdevs": 2, 00:12:51.465 "num_base_bdevs_discovered": 2, 00:12:51.465 "num_base_bdevs_operational": 2, 00:12:51.465 "base_bdevs_list": [ 00:12:51.465 { 00:12:51.465 "name": "BaseBdev1", 00:12:51.465 "uuid": "bb7cf090-117f-5897-8f9f-b5171450b464", 00:12:51.465 "is_configured": true, 00:12:51.465 "data_offset": 2048, 00:12:51.465 "data_size": 63488 00:12:51.465 }, 00:12:51.465 { 00:12:51.465 "name": "BaseBdev2", 00:12:51.465 "uuid": "074a2d48-268e-5980-89f4-5d0ae7b8afd7", 00:12:51.465 "is_configured": true, 00:12:51.465 "data_offset": 2048, 00:12:51.465 "data_size": 63488 00:12:51.465 } 00:12:51.465 ] 00:12:51.465 }' 00:12:51.465 11:54:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:51.465 11:54:04 bdev_raid.raid_read_error_test 
-- common/autotest_common.sh@10 -- # set +x 00:12:52.031 11:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:52.290 [2024-07-15 11:54:05.742243] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:52.290 [2024-07-15 11:54:05.742278] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:52.290 [2024-07-15 11:54:05.745494] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:52.290 [2024-07-15 11:54:05.745524] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:52.290 [2024-07-15 11:54:05.745553] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:52.290 [2024-07-15 11:54:05.745564] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x183ce10 name raid_bdev1, state offline 00:12:52.290 0 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1459086 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1459086 ']' 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1459086 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1459086 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1459086' 00:12:52.290 killing process with pid 1459086 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1459086 00:12:52.290 [2024-07-15 11:54:05.804012] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:52.290 11:54:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1459086 00:12:52.290 [2024-07-15 11:54:05.814338] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Q8LfnT52Dt 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:12:52.549 00:12:52.549 real 0m6.214s 00:12:52.549 user 0m9.722s 00:12:52.549 sys 0m1.081s 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:52.549 11:54:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.549 ************************************ 00:12:52.549 END TEST raid_read_error_test 00:12:52.549 ************************************ 00:12:52.549 11:54:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:52.549 11:54:06 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 
00:12:52.549 11:54:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:52.549 11:54:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:52.549 11:54:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:52.549 ************************************ 00:12:52.549 START TEST raid_write_error_test 00:12:52.549 ************************************ 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 2 write 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 
00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.2iT1vBnOcS 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1460528 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1460528 /var/tmp/spdk-raid.sock 00:12:52.549 11:54:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1460528 ']' 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:12:52.808 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:52.808 11:54:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.808 [2024-07-15 11:54:06.207082] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:52.809 [2024-07-15 11:54:06.207152] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1460528 ] 00:12:52.809 [2024-07-15 11:54:06.336040] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:53.068 [2024-07-15 11:54:06.438757] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:53.068 [2024-07-15 11:54:06.494236] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.068 [2024-07-15 11:54:06.494270] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:53.637 11:54:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:12:53.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:53.637 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:53.896 BaseBdev1_malloc 00:12:53.896 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:54.155 true 00:12:54.155 11:54:07 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:54.414 [2024-07-15 11:54:07.844513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:54.414 [2024-07-15 11:54:07.844558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.414 [2024-07-15 11:54:07.844577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd724e0 00:12:54.414 [2024-07-15 11:54:07.844590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.414 [2024-07-15 11:54:07.846381] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.414 [2024-07-15 11:54:07.846409] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:54.414 BaseBdev1 00:12:54.414 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:12:54.414 11:54:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:54.674 BaseBdev2_malloc 00:12:54.674 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:54.932 true 00:12:54.932 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:55.191 [2024-07-15 11:54:08.580263] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:55.191 [2024-07-15 11:54:08.580309] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:55.191 [2024-07-15 11:54:08.580330] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd777b0 00:12:55.191 [2024-07-15 11:54:08.580343] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:55.191 [2024-07-15 11:54:08.581920] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:55.191 [2024-07-15 11:54:08.581947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:55.191 BaseBdev2 00:12:55.191 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:55.450 [2024-07-15 11:54:08.820923] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:55.450 [2024-07-15 11:54:08.822235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:55.450 [2024-07-15 11:54:08.822419] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd78e10 00:12:55.450 [2024-07-15 11:54:08.822432] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:55.450 [2024-07-15 11:54:08.822625] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbcd2d0 00:12:55.450 [2024-07-15 11:54:08.822781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd78e10 00:12:55.450 [2024-07-15 11:54:08.822791] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd78e10 00:12:55.450 [2024-07-15 11:54:08.822894] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:55.450 11:54:08 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.450 11:54:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:55.709 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.709 "name": "raid_bdev1", 00:12:55.709 "uuid": "6c59d3f4-7a49-4a32-8d7a-656d4fc055af", 00:12:55.709 "strip_size_kb": 64, 00:12:55.709 "state": "online", 00:12:55.709 "raid_level": "raid0", 00:12:55.709 "superblock": true, 00:12:55.709 "num_base_bdevs": 2, 00:12:55.709 "num_base_bdevs_discovered": 2, 00:12:55.709 "num_base_bdevs_operational": 2, 00:12:55.709 "base_bdevs_list": [ 00:12:55.709 { 00:12:55.709 "name": "BaseBdev1", 00:12:55.709 "uuid": "692cedd1-db6d-511b-a24f-67acea580e9a", 00:12:55.709 "is_configured": true, 00:12:55.709 "data_offset": 2048, 00:12:55.709 "data_size": 63488 00:12:55.709 
}, 00:12:55.709 { 00:12:55.709 "name": "BaseBdev2", 00:12:55.709 "uuid": "3b24e6a7-f5b0-5a46-a671-6451b7c90942", 00:12:55.709 "is_configured": true, 00:12:55.709 "data_offset": 2048, 00:12:55.709 "data_size": 63488 00:12:55.709 } 00:12:55.709 ] 00:12:55.709 }' 00:12:55.709 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.709 11:54:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.275 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:12:56.275 11:54:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:56.275 [2024-07-15 11:54:09.755821] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd73f70 00:12:57.211 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:57.469 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=64 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:57.470 11:54:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:57.729 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:57.729 "name": "raid_bdev1", 00:12:57.729 "uuid": "6c59d3f4-7a49-4a32-8d7a-656d4fc055af", 00:12:57.729 "strip_size_kb": 64, 00:12:57.729 "state": "online", 00:12:57.729 "raid_level": "raid0", 00:12:57.729 "superblock": true, 00:12:57.729 "num_base_bdevs": 2, 00:12:57.729 "num_base_bdevs_discovered": 2, 00:12:57.729 "num_base_bdevs_operational": 2, 00:12:57.729 "base_bdevs_list": [ 00:12:57.729 { 00:12:57.729 "name": "BaseBdev1", 00:12:57.729 "uuid": "692cedd1-db6d-511b-a24f-67acea580e9a", 00:12:57.729 "is_configured": true, 00:12:57.729 "data_offset": 2048, 00:12:57.729 "data_size": 63488 00:12:57.729 }, 00:12:57.729 { 00:12:57.729 "name": "BaseBdev2", 00:12:57.729 "uuid": "3b24e6a7-f5b0-5a46-a671-6451b7c90942", 00:12:57.729 "is_configured": true, 00:12:57.729 "data_offset": 2048, 00:12:57.729 "data_size": 63488 00:12:57.729 } 00:12:57.729 ] 00:12:57.729 }' 00:12:57.729 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:57.729 11:54:11 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.297 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:58.557 [2024-07-15 11:54:11.916780] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:58.557 [2024-07-15 11:54:11.916815] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:58.557 [2024-07-15 11:54:11.919997] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.557 [2024-07-15 11:54:11.920027] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:58.557 [2024-07-15 11:54:11.920055] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.557 [2024-07-15 11:54:11.920066] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd78e10 name raid_bdev1, state offline 00:12:58.557 0 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1460528 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1460528 ']' 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1460528 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1460528 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:12:58.557 11:54:11 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1460528' 00:12:58.557 killing process with pid 1460528 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1460528 00:12:58.557 [2024-07-15 11:54:11.998493] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:58.557 11:54:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1460528 00:12:58.557 [2024-07-15 11:54:12.009253] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.2iT1vBnOcS 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:12:58.817 00:12:58.817 real 0m6.123s 00:12:58.817 user 0m9.552s 00:12:58.817 sys 0m1.074s 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:12:58.817 11:54:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.817 ************************************ 00:12:58.817 END TEST raid_write_error_test 00:12:58.817 ************************************ 00:12:58.817 11:54:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:12:58.817 11:54:12 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in 
raid0 concat raid1 00:12:58.817 11:54:12 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:12:58.817 11:54:12 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:12:58.817 11:54:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:12:58.817 11:54:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:58.817 ************************************ 00:12:58.817 START TEST raid_state_function_test 00:12:58.817 ************************************ 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 false 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:58.817 11:54:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1461375 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1461375' 00:12:58.817 Process raid pid: 1461375 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1461375 /var/tmp/spdk-raid.sock 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1461375 ']' 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:58.817 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:58.817 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:12:58.818 11:54:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:58.818 [2024-07-15 11:54:12.409590] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:12:58.818 [2024-07-15 11:54:12.409661] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:59.077 [2024-07-15 11:54:12.542189] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:59.077 [2024-07-15 11:54:12.656245] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:59.337 [2024-07-15 11:54:12.725196] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.337 [2024-07-15 11:54:12.725229] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:12:59.905 [2024-07-15 11:54:13.436031] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:59.905 [2024-07-15 11:54:13.436074] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:59.905 [2024-07-15 11:54:13.436085] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:59.905 [2024-07-15 11:54:13.436097] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.905 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:13:00.164 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.164 "name": "Existed_Raid", 00:13:00.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.164 "strip_size_kb": 64, 00:13:00.164 "state": "configuring", 00:13:00.164 "raid_level": "concat", 00:13:00.164 "superblock": false, 00:13:00.164 "num_base_bdevs": 2, 00:13:00.164 "num_base_bdevs_discovered": 0, 00:13:00.164 "num_base_bdevs_operational": 2, 00:13:00.164 "base_bdevs_list": [ 00:13:00.164 { 00:13:00.164 "name": "BaseBdev1", 00:13:00.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.164 "is_configured": false, 00:13:00.164 "data_offset": 0, 00:13:00.164 "data_size": 0 00:13:00.164 }, 00:13:00.164 { 00:13:00.164 "name": "BaseBdev2", 00:13:00.164 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:00.164 "is_configured": false, 00:13:00.164 "data_offset": 0, 00:13:00.164 "data_size": 0 00:13:00.164 } 00:13:00.164 ] 00:13:00.164 }' 00:13:00.164 11:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.165 11:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.732 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:00.991 [2024-07-15 11:54:14.438572] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:00.991 [2024-07-15 11:54:14.438604] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10c2b00 name Existed_Raid, state configuring 00:13:00.991 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:01.250 [2024-07-15 11:54:14.683226] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:01.250 [2024-07-15 11:54:14.683254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:01.250 [2024-07-15 11:54:14.683263] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:01.250 [2024-07-15 11:54:14.683275] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:01.250 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:01.509 [2024-07-15 11:54:14.941719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:01.509 BaseBdev1 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:01.509 11:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:01.768 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:02.028 [ 00:13:02.028 { 00:13:02.028 "name": 
"BaseBdev1", 00:13:02.028 "aliases": [ 00:13:02.028 "0cad8088-214a-4357-8075-10140f3d303b" 00:13:02.028 ], 00:13:02.028 "product_name": "Malloc disk", 00:13:02.028 "block_size": 512, 00:13:02.028 "num_blocks": 65536, 00:13:02.028 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:02.028 "assigned_rate_limits": { 00:13:02.028 "rw_ios_per_sec": 0, 00:13:02.028 "rw_mbytes_per_sec": 0, 00:13:02.028 "r_mbytes_per_sec": 0, 00:13:02.028 "w_mbytes_per_sec": 0 00:13:02.028 }, 00:13:02.028 "claimed": true, 00:13:02.028 "claim_type": "exclusive_write", 00:13:02.028 "zoned": false, 00:13:02.028 "supported_io_types": { 00:13:02.028 "read": true, 00:13:02.028 "write": true, 00:13:02.028 "unmap": true, 00:13:02.028 "flush": true, 00:13:02.028 "reset": true, 00:13:02.028 "nvme_admin": false, 00:13:02.028 "nvme_io": false, 00:13:02.028 "nvme_io_md": false, 00:13:02.028 "write_zeroes": true, 00:13:02.028 "zcopy": true, 00:13:02.028 "get_zone_info": false, 00:13:02.028 "zone_management": false, 00:13:02.028 "zone_append": false, 00:13:02.028 "compare": false, 00:13:02.028 "compare_and_write": false, 00:13:02.028 "abort": true, 00:13:02.028 "seek_hole": false, 00:13:02.028 "seek_data": false, 00:13:02.028 "copy": true, 00:13:02.028 "nvme_iov_md": false 00:13:02.028 }, 00:13:02.028 "memory_domains": [ 00:13:02.028 { 00:13:02.028 "dma_device_id": "system", 00:13:02.028 "dma_device_type": 1 00:13:02.028 }, 00:13:02.028 { 00:13:02.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.028 "dma_device_type": 2 00:13:02.028 } 00:13:02.028 ], 00:13:02.028 "driver_specific": {} 00:13:02.028 } 00:13:02.028 ] 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:02.028 
11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.028 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:02.288 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.288 "name": "Existed_Raid", 00:13:02.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.288 "strip_size_kb": 64, 00:13:02.288 "state": "configuring", 00:13:02.288 "raid_level": "concat", 00:13:02.288 "superblock": false, 00:13:02.288 "num_base_bdevs": 2, 00:13:02.288 "num_base_bdevs_discovered": 1, 00:13:02.288 "num_base_bdevs_operational": 2, 00:13:02.288 "base_bdevs_list": [ 00:13:02.288 { 00:13:02.288 "name": "BaseBdev1", 00:13:02.288 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:02.288 "is_configured": true, 00:13:02.288 "data_offset": 0, 00:13:02.288 "data_size": 65536 00:13:02.288 }, 00:13:02.288 { 00:13:02.288 "name": "BaseBdev2", 
00:13:02.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:02.288 "is_configured": false, 00:13:02.288 "data_offset": 0, 00:13:02.288 "data_size": 0 00:13:02.288 } 00:13:02.288 ] 00:13:02.288 }' 00:13:02.288 11:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:02.288 11:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.856 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:03.115 [2024-07-15 11:54:16.521904] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:03.115 [2024-07-15 11:54:16.521944] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10c23d0 name Existed_Raid, state configuring 00:13:03.115 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:03.115 [2024-07-15 11:54:16.698403] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:03.115 [2024-07-15 11:54:16.699920] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:03.115 [2024-07-15 11:54:16.699951] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.375 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:03.634 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.634 "name": "Existed_Raid", 00:13:03.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.634 "strip_size_kb": 64, 00:13:03.634 "state": "configuring", 00:13:03.634 "raid_level": "concat", 00:13:03.634 "superblock": false, 00:13:03.634 "num_base_bdevs": 2, 00:13:03.634 "num_base_bdevs_discovered": 1, 00:13:03.634 "num_base_bdevs_operational": 2, 00:13:03.634 "base_bdevs_list": [ 00:13:03.634 { 00:13:03.634 "name": "BaseBdev1", 00:13:03.634 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:03.634 "is_configured": true, 00:13:03.634 "data_offset": 0, 00:13:03.634 "data_size": 65536 00:13:03.634 }, 00:13:03.634 { 00:13:03.634 "name": 
"BaseBdev2", 00:13:03.634 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.634 "is_configured": false, 00:13:03.634 "data_offset": 0, 00:13:03.634 "data_size": 0 00:13:03.634 } 00:13:03.634 ] 00:13:03.634 }' 00:13:03.634 11:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.634 11:54:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.202 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:04.462 [2024-07-15 11:54:17.800737] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:04.462 [2024-07-15 11:54:17.800775] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10c3150 00:13:04.462 [2024-07-15 11:54:17.800783] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:13:04.462 [2024-07-15 11:54:17.800978] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfdd420 00:13:04.462 [2024-07-15 11:54:17.801096] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10c3150 00:13:04.462 [2024-07-15 11:54:17.801106] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x10c3150 00:13:04.462 [2024-07-15 11:54:17.801270] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.462 BaseBdev2 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 
00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:04.462 11:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:04.721 [ 00:13:04.721 { 00:13:04.721 "name": "BaseBdev2", 00:13:04.721 "aliases": [ 00:13:04.721 "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e" 00:13:04.721 ], 00:13:04.721 "product_name": "Malloc disk", 00:13:04.721 "block_size": 512, 00:13:04.721 "num_blocks": 65536, 00:13:04.721 "uuid": "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e", 00:13:04.721 "assigned_rate_limits": { 00:13:04.721 "rw_ios_per_sec": 0, 00:13:04.721 "rw_mbytes_per_sec": 0, 00:13:04.721 "r_mbytes_per_sec": 0, 00:13:04.721 "w_mbytes_per_sec": 0 00:13:04.721 }, 00:13:04.721 "claimed": true, 00:13:04.721 "claim_type": "exclusive_write", 00:13:04.721 "zoned": false, 00:13:04.721 "supported_io_types": { 00:13:04.721 "read": true, 00:13:04.721 "write": true, 00:13:04.721 "unmap": true, 00:13:04.721 "flush": true, 00:13:04.721 "reset": true, 00:13:04.721 "nvme_admin": false, 00:13:04.721 "nvme_io": false, 00:13:04.721 "nvme_io_md": false, 00:13:04.721 "write_zeroes": true, 00:13:04.721 "zcopy": true, 00:13:04.721 "get_zone_info": false, 00:13:04.721 "zone_management": false, 00:13:04.721 "zone_append": false, 00:13:04.721 "compare": false, 00:13:04.721 "compare_and_write": false, 00:13:04.721 "abort": true, 00:13:04.721 "seek_hole": false, 00:13:04.722 "seek_data": false, 00:13:04.722 "copy": true, 00:13:04.722 "nvme_iov_md": false 00:13:04.722 }, 00:13:04.722 
"memory_domains": [ 00:13:04.722 { 00:13:04.722 "dma_device_id": "system", 00:13:04.722 "dma_device_type": 1 00:13:04.722 }, 00:13:04.722 { 00:13:04.722 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:04.722 "dma_device_type": 2 00:13:04.722 } 00:13:04.722 ], 00:13:04.722 "driver_specific": {} 00:13:04.722 } 00:13:04.722 ] 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:04.722 11:54:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.981 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.981 "name": "Existed_Raid", 00:13:04.981 "uuid": "53de6092-abd1-4724-a217-7d84dec905e1", 00:13:04.981 "strip_size_kb": 64, 00:13:04.981 "state": "online", 00:13:04.981 "raid_level": "concat", 00:13:04.981 "superblock": false, 00:13:04.981 "num_base_bdevs": 2, 00:13:04.981 "num_base_bdevs_discovered": 2, 00:13:04.981 "num_base_bdevs_operational": 2, 00:13:04.981 "base_bdevs_list": [ 00:13:04.981 { 00:13:04.981 "name": "BaseBdev1", 00:13:04.981 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:04.981 "is_configured": true, 00:13:04.981 "data_offset": 0, 00:13:04.981 "data_size": 65536 00:13:04.981 }, 00:13:04.981 { 00:13:04.981 "name": "BaseBdev2", 00:13:04.981 "uuid": "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e", 00:13:04.981 "is_configured": true, 00:13:04.981 "data_offset": 0, 00:13:04.981 "data_size": 65536 00:13:04.981 } 00:13:04.981 ] 00:13:04.981 }' 00:13:04.981 11:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.981 11:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.549 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:05.549 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:05.549 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:05.549 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:05.549 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:05.550 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 
00:13:05.550 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:05.550 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:05.809 [2024-07-15 11:54:19.309010] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:05.809 "name": "Existed_Raid", 00:13:05.809 "aliases": [ 00:13:05.809 "53de6092-abd1-4724-a217-7d84dec905e1" 00:13:05.809 ], 00:13:05.809 "product_name": "Raid Volume", 00:13:05.809 "block_size": 512, 00:13:05.809 "num_blocks": 131072, 00:13:05.809 "uuid": "53de6092-abd1-4724-a217-7d84dec905e1", 00:13:05.809 "assigned_rate_limits": { 00:13:05.809 "rw_ios_per_sec": 0, 00:13:05.809 "rw_mbytes_per_sec": 0, 00:13:05.809 "r_mbytes_per_sec": 0, 00:13:05.809 "w_mbytes_per_sec": 0 00:13:05.809 }, 00:13:05.809 "claimed": false, 00:13:05.809 "zoned": false, 00:13:05.809 "supported_io_types": { 00:13:05.809 "read": true, 00:13:05.809 "write": true, 00:13:05.809 "unmap": true, 00:13:05.809 "flush": true, 00:13:05.809 "reset": true, 00:13:05.809 "nvme_admin": false, 00:13:05.809 "nvme_io": false, 00:13:05.809 "nvme_io_md": false, 00:13:05.809 "write_zeroes": true, 00:13:05.809 "zcopy": false, 00:13:05.809 "get_zone_info": false, 00:13:05.809 "zone_management": false, 00:13:05.809 "zone_append": false, 00:13:05.809 "compare": false, 00:13:05.809 "compare_and_write": false, 00:13:05.809 "abort": false, 00:13:05.809 "seek_hole": false, 00:13:05.809 "seek_data": false, 00:13:05.809 "copy": false, 00:13:05.809 "nvme_iov_md": false 00:13:05.809 }, 00:13:05.809 "memory_domains": [ 00:13:05.809 { 00:13:05.809 "dma_device_id": "system", 00:13:05.809 "dma_device_type": 1 00:13:05.809 }, 00:13:05.809 { 00:13:05.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:13:05.809 "dma_device_type": 2 00:13:05.809 }, 00:13:05.809 { 00:13:05.809 "dma_device_id": "system", 00:13:05.809 "dma_device_type": 1 00:13:05.809 }, 00:13:05.809 { 00:13:05.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.809 "dma_device_type": 2 00:13:05.809 } 00:13:05.809 ], 00:13:05.809 "driver_specific": { 00:13:05.809 "raid": { 00:13:05.809 "uuid": "53de6092-abd1-4724-a217-7d84dec905e1", 00:13:05.809 "strip_size_kb": 64, 00:13:05.809 "state": "online", 00:13:05.809 "raid_level": "concat", 00:13:05.809 "superblock": false, 00:13:05.809 "num_base_bdevs": 2, 00:13:05.809 "num_base_bdevs_discovered": 2, 00:13:05.809 "num_base_bdevs_operational": 2, 00:13:05.809 "base_bdevs_list": [ 00:13:05.809 { 00:13:05.809 "name": "BaseBdev1", 00:13:05.809 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:05.809 "is_configured": true, 00:13:05.809 "data_offset": 0, 00:13:05.809 "data_size": 65536 00:13:05.809 }, 00:13:05.809 { 00:13:05.809 "name": "BaseBdev2", 00:13:05.809 "uuid": "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e", 00:13:05.809 "is_configured": true, 00:13:05.809 "data_offset": 0, 00:13:05.809 "data_size": 65536 00:13:05.809 } 00:13:05.809 ] 00:13:05.809 } 00:13:05.809 } 00:13:05.809 }' 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:05.809 BaseBdev2' 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:05.809 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.069 11:54:19 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.069 "name": "BaseBdev1", 00:13:06.069 "aliases": [ 00:13:06.069 "0cad8088-214a-4357-8075-10140f3d303b" 00:13:06.069 ], 00:13:06.069 "product_name": "Malloc disk", 00:13:06.069 "block_size": 512, 00:13:06.069 "num_blocks": 65536, 00:13:06.069 "uuid": "0cad8088-214a-4357-8075-10140f3d303b", 00:13:06.069 "assigned_rate_limits": { 00:13:06.069 "rw_ios_per_sec": 0, 00:13:06.069 "rw_mbytes_per_sec": 0, 00:13:06.069 "r_mbytes_per_sec": 0, 00:13:06.069 "w_mbytes_per_sec": 0 00:13:06.069 }, 00:13:06.069 "claimed": true, 00:13:06.069 "claim_type": "exclusive_write", 00:13:06.069 "zoned": false, 00:13:06.069 "supported_io_types": { 00:13:06.069 "read": true, 00:13:06.069 "write": true, 00:13:06.069 "unmap": true, 00:13:06.069 "flush": true, 00:13:06.069 "reset": true, 00:13:06.069 "nvme_admin": false, 00:13:06.069 "nvme_io": false, 00:13:06.069 "nvme_io_md": false, 00:13:06.069 "write_zeroes": true, 00:13:06.069 "zcopy": true, 00:13:06.069 "get_zone_info": false, 00:13:06.069 "zone_management": false, 00:13:06.069 "zone_append": false, 00:13:06.069 "compare": false, 00:13:06.069 "compare_and_write": false, 00:13:06.069 "abort": true, 00:13:06.069 "seek_hole": false, 00:13:06.069 "seek_data": false, 00:13:06.069 "copy": true, 00:13:06.069 "nvme_iov_md": false 00:13:06.069 }, 00:13:06.069 "memory_domains": [ 00:13:06.069 { 00:13:06.069 "dma_device_id": "system", 00:13:06.069 "dma_device_type": 1 00:13:06.069 }, 00:13:06.069 { 00:13:06.069 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.069 "dma_device_type": 2 00:13:06.069 } 00:13:06.069 ], 00:13:06.069 "driver_specific": {} 00:13:06.069 }' 00:13:06.069 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.329 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.329 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.329 11:54:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.329 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.329 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.329 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.588 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.588 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.588 11:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.588 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.588 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.588 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.588 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:06.588 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:07.157 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:07.157 "name": "BaseBdev2", 00:13:07.157 "aliases": [ 00:13:07.157 "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e" 00:13:07.157 ], 00:13:07.157 "product_name": "Malloc disk", 00:13:07.157 "block_size": 512, 00:13:07.157 "num_blocks": 65536, 00:13:07.157 "uuid": "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e", 00:13:07.157 "assigned_rate_limits": { 00:13:07.157 "rw_ios_per_sec": 0, 00:13:07.157 "rw_mbytes_per_sec": 0, 00:13:07.157 "r_mbytes_per_sec": 0, 00:13:07.157 "w_mbytes_per_sec": 0 00:13:07.157 }, 00:13:07.157 "claimed": true, 00:13:07.157 "claim_type": "exclusive_write", 
00:13:07.157 "zoned": false, 00:13:07.157 "supported_io_types": { 00:13:07.157 "read": true, 00:13:07.157 "write": true, 00:13:07.157 "unmap": true, 00:13:07.157 "flush": true, 00:13:07.157 "reset": true, 00:13:07.157 "nvme_admin": false, 00:13:07.157 "nvme_io": false, 00:13:07.157 "nvme_io_md": false, 00:13:07.157 "write_zeroes": true, 00:13:07.157 "zcopy": true, 00:13:07.157 "get_zone_info": false, 00:13:07.157 "zone_management": false, 00:13:07.157 "zone_append": false, 00:13:07.157 "compare": false, 00:13:07.157 "compare_and_write": false, 00:13:07.157 "abort": true, 00:13:07.157 "seek_hole": false, 00:13:07.157 "seek_data": false, 00:13:07.157 "copy": true, 00:13:07.157 "nvme_iov_md": false 00:13:07.157 }, 00:13:07.157 "memory_domains": [ 00:13:07.157 { 00:13:07.157 "dma_device_id": "system", 00:13:07.157 "dma_device_type": 1 00:13:07.157 }, 00:13:07.157 { 00:13:07.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:07.157 "dma_device_type": 2 00:13:07.157 } 00:13:07.157 ], 00:13:07.157 "driver_specific": {} 00:13:07.157 }' 00:13:07.157 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.157 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:07.157 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:07.417 11:54:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.417 11:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:07.675 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:07.675 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:07.935 [2024-07-15 11:54:21.518882] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:07.935 [2024-07-15 11:54:21.518914] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:07.935 [2024-07-15 11:54:21.518958] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:08.195 11:54:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:08.195 11:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:08.763 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:08.763 "name": "Existed_Raid", 00:13:08.763 "uuid": "53de6092-abd1-4724-a217-7d84dec905e1", 00:13:08.763 "strip_size_kb": 64, 00:13:08.763 "state": "offline", 00:13:08.764 "raid_level": "concat", 00:13:08.764 "superblock": false, 00:13:08.764 "num_base_bdevs": 2, 00:13:08.764 "num_base_bdevs_discovered": 1, 00:13:08.764 "num_base_bdevs_operational": 1, 00:13:08.764 "base_bdevs_list": [ 00:13:08.764 { 00:13:08.764 "name": null, 00:13:08.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:08.764 "is_configured": false, 00:13:08.764 "data_offset": 0, 00:13:08.764 "data_size": 65536 00:13:08.764 }, 00:13:08.764 { 00:13:08.764 "name": "BaseBdev2", 00:13:08.764 "uuid": "b2e8103d-6dfc-4f9b-8f25-c53ea88ccf0e", 00:13:08.764 "is_configured": true, 00:13:08.764 "data_offset": 0, 00:13:08.764 "data_size": 65536 00:13:08.764 } 00:13:08.764 ] 00:13:08.764 }' 00:13:08.764 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:08.764 11:54:22 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:09.331 11:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:09.591 [2024-07-15 11:54:23.145066] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:09.591 [2024-07-15 11:54:23.145117] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10c3150 name Existed_Raid, state offline 00:13:09.591 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:09.591 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:09.591 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:09.591 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1461375 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1461375 ']' 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1461375 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:09.849 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1461375 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1461375' 00:13:10.108 killing process with pid 1461375 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1461375 00:13:10.108 [2024-07-15 11:54:23.476148] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1461375 00:13:10.108 [2024-07-15 11:54:23.477034] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:10.108 00:13:10.108 real 0m11.360s 00:13:10.108 user 0m20.225s 00:13:10.108 sys 0m2.052s 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:10.108 11:54:23 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:13:10.108 ************************************ 00:13:10.108 END TEST raid_state_function_test 00:13:10.108 ************************************ 00:13:10.367 11:54:23 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:10.367 11:54:23 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:13:10.367 11:54:23 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:10.367 11:54:23 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:10.367 11:54:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:10.367 ************************************ 00:13:10.367 START TEST raid_state_function_test_sb 00:13:10.367 ************************************ 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 2 true 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1463137 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1463137' 00:13:10.367 Process raid pid: 1463137 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L 
bdev_raid 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1463137 /var/tmp/spdk-raid.sock 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1463137 ']' 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:10.367 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:10.367 11:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:10.367 [2024-07-15 11:54:23.863355] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:13:10.368 [2024-07-15 11:54:23.863421] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:10.671 [2024-07-15 11:54:23.994269] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:10.671 [2024-07-15 11:54:24.101095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:10.671 [2024-07-15 11:54:24.169405] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:10.671 [2024-07-15 11:54:24.169453] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:11.260 11:54:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:11.260 11:54:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:11.260 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:11.519 [2024-07-15 11:54:24.960089] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:11.519 [2024-07-15 11:54:24.960129] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:11.519 [2024-07-15 11:54:24.960140] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:11.519 [2024-07-15 11:54:24.960151] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.519 11:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:11.807 11:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.807 "name": "Existed_Raid", 00:13:11.807 "uuid": "29b00081-0aad-4185-a600-a5a507a8bc3c", 00:13:11.807 "strip_size_kb": 64, 00:13:11.807 "state": "configuring", 00:13:11.807 "raid_level": "concat", 00:13:11.807 "superblock": true, 00:13:11.807 "num_base_bdevs": 2, 00:13:11.807 "num_base_bdevs_discovered": 0, 00:13:11.807 "num_base_bdevs_operational": 2, 00:13:11.807 "base_bdevs_list": [ 00:13:11.807 { 00:13:11.807 "name": "BaseBdev1", 00:13:11.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.807 "is_configured": false, 00:13:11.807 "data_offset": 0, 00:13:11.807 "data_size": 0 00:13:11.807 }, 00:13:11.807 { 
00:13:11.807 "name": "BaseBdev2", 00:13:11.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:11.807 "is_configured": false, 00:13:11.807 "data_offset": 0, 00:13:11.807 "data_size": 0 00:13:11.807 } 00:13:11.807 ] 00:13:11.807 }' 00:13:11.807 11:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.807 11:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:12.374 11:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:12.632 [2024-07-15 11:54:26.046803] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:12.632 [2024-07-15 11:54:26.046831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdbb00 name Existed_Raid, state configuring 00:13:12.632 11:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:12.890 [2024-07-15 11:54:26.299495] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:12.890 [2024-07-15 11:54:26.299520] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:12.890 [2024-07-15 11:54:26.299529] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:12.890 [2024-07-15 11:54:26.299541] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:12.891 11:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:13.149 [2024-07-15 11:54:26.558013] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:13.149 BaseBdev1 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:13.149 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:13.407 11:54:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:13.666 [ 00:13:13.666 { 00:13:13.666 "name": "BaseBdev1", 00:13:13.666 "aliases": [ 00:13:13.666 "757951d8-274d-465e-a492-ec3113195833" 00:13:13.666 ], 00:13:13.666 "product_name": "Malloc disk", 00:13:13.666 "block_size": 512, 00:13:13.666 "num_blocks": 65536, 00:13:13.666 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:13.666 "assigned_rate_limits": { 00:13:13.666 "rw_ios_per_sec": 0, 00:13:13.666 "rw_mbytes_per_sec": 0, 00:13:13.666 "r_mbytes_per_sec": 0, 00:13:13.666 "w_mbytes_per_sec": 0 00:13:13.666 }, 00:13:13.666 "claimed": true, 00:13:13.666 "claim_type": "exclusive_write", 00:13:13.666 "zoned": false, 00:13:13.666 "supported_io_types": { 00:13:13.666 "read": true, 00:13:13.666 "write": true, 00:13:13.666 "unmap": true, 00:13:13.666 "flush": 
true, 00:13:13.666 "reset": true, 00:13:13.666 "nvme_admin": false, 00:13:13.666 "nvme_io": false, 00:13:13.666 "nvme_io_md": false, 00:13:13.666 "write_zeroes": true, 00:13:13.666 "zcopy": true, 00:13:13.666 "get_zone_info": false, 00:13:13.666 "zone_management": false, 00:13:13.666 "zone_append": false, 00:13:13.666 "compare": false, 00:13:13.666 "compare_and_write": false, 00:13:13.666 "abort": true, 00:13:13.666 "seek_hole": false, 00:13:13.666 "seek_data": false, 00:13:13.666 "copy": true, 00:13:13.666 "nvme_iov_md": false 00:13:13.666 }, 00:13:13.666 "memory_domains": [ 00:13:13.666 { 00:13:13.666 "dma_device_id": "system", 00:13:13.666 "dma_device_type": 1 00:13:13.666 }, 00:13:13.666 { 00:13:13.666 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:13.666 "dma_device_type": 2 00:13:13.666 } 00:13:13.666 ], 00:13:13.666 "driver_specific": {} 00:13:13.666 } 00:13:13.666 ] 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.666 11:54:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:13.666 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.925 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.925 "name": "Existed_Raid", 00:13:13.925 "uuid": "c4e6571e-6938-4c59-b093-ed9d9f06b660", 00:13:13.925 "strip_size_kb": 64, 00:13:13.925 "state": "configuring", 00:13:13.925 "raid_level": "concat", 00:13:13.925 "superblock": true, 00:13:13.925 "num_base_bdevs": 2, 00:13:13.925 "num_base_bdevs_discovered": 1, 00:13:13.925 "num_base_bdevs_operational": 2, 00:13:13.925 "base_bdevs_list": [ 00:13:13.925 { 00:13:13.925 "name": "BaseBdev1", 00:13:13.925 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:13.925 "is_configured": true, 00:13:13.925 "data_offset": 2048, 00:13:13.925 "data_size": 63488 00:13:13.925 }, 00:13:13.925 { 00:13:13.925 "name": "BaseBdev2", 00:13:13.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:13.925 "is_configured": false, 00:13:13.925 "data_offset": 0, 00:13:13.925 "data_size": 0 00:13:13.925 } 00:13:13.925 ] 00:13:13.925 }' 00:13:13.925 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.925 11:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:14.494 11:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:14.494 [2024-07-15 11:54:28.070070] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:14.494 [2024-07-15 11:54:28.070105] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdb3d0 name Existed_Raid, state configuring 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:14.754 [2024-07-15 11:54:28.318769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:14.754 [2024-07-15 11:54:28.320232] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:14.754 [2024-07-15 11:54:28.320268] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:14.754 11:54:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:14.754 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:15.013 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:15.013 "name": "Existed_Raid", 00:13:15.013 "uuid": "276625c0-dcff-43a0-b679-b6f63778b4ac", 00:13:15.013 "strip_size_kb": 64, 00:13:15.013 "state": "configuring", 00:13:15.013 "raid_level": "concat", 00:13:15.013 "superblock": true, 00:13:15.013 "num_base_bdevs": 2, 00:13:15.013 "num_base_bdevs_discovered": 1, 00:13:15.013 "num_base_bdevs_operational": 2, 00:13:15.013 "base_bdevs_list": [ 00:13:15.013 { 00:13:15.013 "name": "BaseBdev1", 00:13:15.013 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:15.013 "is_configured": true, 00:13:15.013 "data_offset": 2048, 00:13:15.013 "data_size": 63488 00:13:15.013 }, 00:13:15.013 { 00:13:15.013 "name": "BaseBdev2", 00:13:15.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:15.013 "is_configured": false, 00:13:15.013 "data_offset": 0, 00:13:15.013 "data_size": 0 00:13:15.013 } 00:13:15.013 ] 00:13:15.013 }' 00:13:15.013 11:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:15.013 11:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:15.951 [2024-07-15 11:54:29.437728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:15.951 [2024-07-15 11:54:29.437876] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xfdc150 00:13:15.951 [2024-07-15 11:54:29.437890] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:15.951 [2024-07-15 11:54:29.438062] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef6420 00:13:15.951 [2024-07-15 11:54:29.438176] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfdc150 00:13:15.951 [2024-07-15 11:54:29.438186] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xfdc150 00:13:15.951 [2024-07-15 11:54:29.438275] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:15.951 BaseBdev2 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:15.951 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:16.211 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:16.470 [ 00:13:16.470 { 00:13:16.470 "name": "BaseBdev2", 00:13:16.470 "aliases": [ 00:13:16.470 "53aced08-3e04-44b4-97ff-9ae04a3f6217" 00:13:16.470 ], 00:13:16.470 "product_name": "Malloc disk", 00:13:16.470 "block_size": 512, 00:13:16.470 "num_blocks": 65536, 00:13:16.470 "uuid": "53aced08-3e04-44b4-97ff-9ae04a3f6217", 00:13:16.470 "assigned_rate_limits": { 00:13:16.470 "rw_ios_per_sec": 0, 00:13:16.470 "rw_mbytes_per_sec": 0, 00:13:16.470 "r_mbytes_per_sec": 0, 00:13:16.470 "w_mbytes_per_sec": 0 00:13:16.470 }, 00:13:16.470 "claimed": true, 00:13:16.470 "claim_type": "exclusive_write", 00:13:16.470 "zoned": false, 00:13:16.470 "supported_io_types": { 00:13:16.470 "read": true, 00:13:16.470 "write": true, 00:13:16.470 "unmap": true, 00:13:16.470 "flush": true, 00:13:16.470 "reset": true, 00:13:16.470 "nvme_admin": false, 00:13:16.470 "nvme_io": false, 00:13:16.470 "nvme_io_md": false, 00:13:16.470 "write_zeroes": true, 00:13:16.470 "zcopy": true, 00:13:16.470 "get_zone_info": false, 00:13:16.470 "zone_management": false, 00:13:16.470 "zone_append": false, 00:13:16.470 "compare": false, 00:13:16.470 "compare_and_write": false, 00:13:16.470 "abort": true, 00:13:16.470 "seek_hole": false, 00:13:16.470 "seek_data": false, 00:13:16.470 "copy": true, 00:13:16.471 "nvme_iov_md": false 00:13:16.471 }, 00:13:16.471 "memory_domains": [ 00:13:16.471 { 00:13:16.471 "dma_device_id": "system", 00:13:16.471 "dma_device_type": 1 00:13:16.471 }, 00:13:16.471 { 00:13:16.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:16.471 "dma_device_type": 2 00:13:16.471 } 00:13:16.471 ], 00:13:16.471 "driver_specific": {} 00:13:16.471 } 00:13:16.471 ] 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:16.471 11:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:16.729 11:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:16.729 "name": "Existed_Raid", 00:13:16.729 "uuid": "276625c0-dcff-43a0-b679-b6f63778b4ac", 00:13:16.729 "strip_size_kb": 64, 00:13:16.729 "state": "online", 00:13:16.729 "raid_level": "concat", 00:13:16.729 "superblock": true, 00:13:16.729 
"num_base_bdevs": 2, 00:13:16.729 "num_base_bdevs_discovered": 2, 00:13:16.729 "num_base_bdevs_operational": 2, 00:13:16.729 "base_bdevs_list": [ 00:13:16.729 { 00:13:16.729 "name": "BaseBdev1", 00:13:16.729 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:16.729 "is_configured": true, 00:13:16.729 "data_offset": 2048, 00:13:16.729 "data_size": 63488 00:13:16.729 }, 00:13:16.729 { 00:13:16.729 "name": "BaseBdev2", 00:13:16.729 "uuid": "53aced08-3e04-44b4-97ff-9ae04a3f6217", 00:13:16.729 "is_configured": true, 00:13:16.729 "data_offset": 2048, 00:13:16.729 "data_size": 63488 00:13:16.729 } 00:13:16.729 ] 00:13:16.729 }' 00:13:16.729 11:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:16.729 11:54:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:17.664 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:17.923 [2024-07-15 11:54:31.282877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:17.923 11:54:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:17.923 "name": "Existed_Raid", 00:13:17.923 "aliases": [ 00:13:17.923 "276625c0-dcff-43a0-b679-b6f63778b4ac" 00:13:17.923 ], 00:13:17.923 "product_name": "Raid Volume", 00:13:17.923 "block_size": 512, 00:13:17.923 "num_blocks": 126976, 00:13:17.923 "uuid": "276625c0-dcff-43a0-b679-b6f63778b4ac", 00:13:17.923 "assigned_rate_limits": { 00:13:17.923 "rw_ios_per_sec": 0, 00:13:17.923 "rw_mbytes_per_sec": 0, 00:13:17.923 "r_mbytes_per_sec": 0, 00:13:17.923 "w_mbytes_per_sec": 0 00:13:17.923 }, 00:13:17.923 "claimed": false, 00:13:17.923 "zoned": false, 00:13:17.923 "supported_io_types": { 00:13:17.923 "read": true, 00:13:17.923 "write": true, 00:13:17.923 "unmap": true, 00:13:17.923 "flush": true, 00:13:17.923 "reset": true, 00:13:17.923 "nvme_admin": false, 00:13:17.923 "nvme_io": false, 00:13:17.923 "nvme_io_md": false, 00:13:17.923 "write_zeroes": true, 00:13:17.923 "zcopy": false, 00:13:17.923 "get_zone_info": false, 00:13:17.923 "zone_management": false, 00:13:17.923 "zone_append": false, 00:13:17.923 "compare": false, 00:13:17.923 "compare_and_write": false, 00:13:17.923 "abort": false, 00:13:17.923 "seek_hole": false, 00:13:17.923 "seek_data": false, 00:13:17.923 "copy": false, 00:13:17.923 "nvme_iov_md": false 00:13:17.923 }, 00:13:17.923 "memory_domains": [ 00:13:17.923 { 00:13:17.923 "dma_device_id": "system", 00:13:17.923 "dma_device_type": 1 00:13:17.923 }, 00:13:17.923 { 00:13:17.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.923 "dma_device_type": 2 00:13:17.923 }, 00:13:17.923 { 00:13:17.923 "dma_device_id": "system", 00:13:17.923 "dma_device_type": 1 00:13:17.923 }, 00:13:17.923 { 00:13:17.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:17.923 "dma_device_type": 2 00:13:17.923 } 00:13:17.923 ], 00:13:17.923 "driver_specific": { 00:13:17.923 "raid": { 00:13:17.923 "uuid": "276625c0-dcff-43a0-b679-b6f63778b4ac", 00:13:17.923 "strip_size_kb": 64, 
00:13:17.923 "state": "online", 00:13:17.923 "raid_level": "concat", 00:13:17.923 "superblock": true, 00:13:17.923 "num_base_bdevs": 2, 00:13:17.923 "num_base_bdevs_discovered": 2, 00:13:17.923 "num_base_bdevs_operational": 2, 00:13:17.923 "base_bdevs_list": [ 00:13:17.923 { 00:13:17.923 "name": "BaseBdev1", 00:13:17.923 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:17.923 "is_configured": true, 00:13:17.923 "data_offset": 2048, 00:13:17.923 "data_size": 63488 00:13:17.923 }, 00:13:17.923 { 00:13:17.923 "name": "BaseBdev2", 00:13:17.923 "uuid": "53aced08-3e04-44b4-97ff-9ae04a3f6217", 00:13:17.923 "is_configured": true, 00:13:17.923 "data_offset": 2048, 00:13:17.923 "data_size": 63488 00:13:17.923 } 00:13:17.923 ] 00:13:17.923 } 00:13:17.923 } 00:13:17.923 }' 00:13:17.923 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:17.923 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:17.923 BaseBdev2' 00:13:17.923 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:17.923 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:17.923 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.182 "name": "BaseBdev1", 00:13:18.182 "aliases": [ 00:13:18.182 "757951d8-274d-465e-a492-ec3113195833" 00:13:18.182 ], 00:13:18.182 "product_name": "Malloc disk", 00:13:18.182 "block_size": 512, 00:13:18.182 "num_blocks": 65536, 00:13:18.182 "uuid": "757951d8-274d-465e-a492-ec3113195833", 00:13:18.182 "assigned_rate_limits": { 00:13:18.182 "rw_ios_per_sec": 0, 
00:13:18.182 "rw_mbytes_per_sec": 0, 00:13:18.182 "r_mbytes_per_sec": 0, 00:13:18.182 "w_mbytes_per_sec": 0 00:13:18.182 }, 00:13:18.182 "claimed": true, 00:13:18.182 "claim_type": "exclusive_write", 00:13:18.182 "zoned": false, 00:13:18.182 "supported_io_types": { 00:13:18.182 "read": true, 00:13:18.182 "write": true, 00:13:18.182 "unmap": true, 00:13:18.182 "flush": true, 00:13:18.182 "reset": true, 00:13:18.182 "nvme_admin": false, 00:13:18.182 "nvme_io": false, 00:13:18.182 "nvme_io_md": false, 00:13:18.182 "write_zeroes": true, 00:13:18.182 "zcopy": true, 00:13:18.182 "get_zone_info": false, 00:13:18.182 "zone_management": false, 00:13:18.182 "zone_append": false, 00:13:18.182 "compare": false, 00:13:18.182 "compare_and_write": false, 00:13:18.182 "abort": true, 00:13:18.182 "seek_hole": false, 00:13:18.182 "seek_data": false, 00:13:18.182 "copy": true, 00:13:18.182 "nvme_iov_md": false 00:13:18.182 }, 00:13:18.182 "memory_domains": [ 00:13:18.182 { 00:13:18.182 "dma_device_id": "system", 00:13:18.182 "dma_device_type": 1 00:13:18.182 }, 00:13:18.182 { 00:13:18.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.182 "dma_device_type": 2 00:13:18.182 } 00:13:18.182 ], 00:13:18.182 "driver_specific": {} 00:13:18.182 }' 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.182 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.441 
11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:18.441 11:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:18.700 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:18.700 "name": "BaseBdev2", 00:13:18.700 "aliases": [ 00:13:18.700 "53aced08-3e04-44b4-97ff-9ae04a3f6217" 00:13:18.700 ], 00:13:18.700 "product_name": "Malloc disk", 00:13:18.700 "block_size": 512, 00:13:18.700 "num_blocks": 65536, 00:13:18.700 "uuid": "53aced08-3e04-44b4-97ff-9ae04a3f6217", 00:13:18.700 "assigned_rate_limits": { 00:13:18.700 "rw_ios_per_sec": 0, 00:13:18.700 "rw_mbytes_per_sec": 0, 00:13:18.700 "r_mbytes_per_sec": 0, 00:13:18.700 "w_mbytes_per_sec": 0 00:13:18.700 }, 00:13:18.700 "claimed": true, 00:13:18.700 "claim_type": "exclusive_write", 00:13:18.700 "zoned": false, 00:13:18.700 "supported_io_types": { 00:13:18.700 "read": true, 00:13:18.700 "write": true, 00:13:18.700 "unmap": true, 00:13:18.700 "flush": true, 00:13:18.700 "reset": true, 00:13:18.700 "nvme_admin": false, 00:13:18.700 "nvme_io": false, 00:13:18.700 "nvme_io_md": false, 00:13:18.700 "write_zeroes": true, 00:13:18.700 "zcopy": true, 
00:13:18.700 "get_zone_info": false, 00:13:18.700 "zone_management": false, 00:13:18.700 "zone_append": false, 00:13:18.700 "compare": false, 00:13:18.700 "compare_and_write": false, 00:13:18.701 "abort": true, 00:13:18.701 "seek_hole": false, 00:13:18.701 "seek_data": false, 00:13:18.701 "copy": true, 00:13:18.701 "nvme_iov_md": false 00:13:18.701 }, 00:13:18.701 "memory_domains": [ 00:13:18.701 { 00:13:18.701 "dma_device_id": "system", 00:13:18.701 "dma_device_type": 1 00:13:18.701 }, 00:13:18.701 { 00:13:18.701 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:18.701 "dma_device_type": 2 00:13:18.701 } 00:13:18.701 ], 00:13:18.701 "driver_specific": {} 00:13:18.701 }' 00:13:18.701 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.701 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:18.959 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:18.959 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.959 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:18.959 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:18.959 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.960 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:18.960 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:18.960 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:18.960 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:19.219 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:19.219 11:54:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:19.219 [2024-07-15 11:54:32.794836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:19.219 [2024-07-15 11:54:32.794865] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:19.219 [2024-07-15 11:54:32.794905] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:19.478 11:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:19.478 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:19.478 "name": "Existed_Raid", 00:13:19.478 "uuid": "276625c0-dcff-43a0-b679-b6f63778b4ac", 00:13:19.478 "strip_size_kb": 64, 00:13:19.478 "state": "offline", 00:13:19.478 "raid_level": "concat", 00:13:19.478 "superblock": true, 00:13:19.478 "num_base_bdevs": 2, 00:13:19.478 "num_base_bdevs_discovered": 1, 00:13:19.478 "num_base_bdevs_operational": 1, 00:13:19.478 "base_bdevs_list": [ 00:13:19.478 { 00:13:19.478 "name": null, 00:13:19.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:19.478 "is_configured": false, 00:13:19.478 "data_offset": 2048, 00:13:19.478 "data_size": 63488 00:13:19.478 }, 00:13:19.478 { 00:13:19.478 "name": "BaseBdev2", 00:13:19.478 "uuid": "53aced08-3e04-44b4-97ff-9ae04a3f6217", 00:13:19.478 "is_configured": true, 00:13:19.478 "data_offset": 2048, 00:13:19.478 "data_size": 63488 00:13:19.478 } 00:13:19.478 ] 00:13:19.478 }' 00:13:19.478 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:19.478 11:54:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:20.415 11:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:20.674 [2024-07-15 11:54:34.083290] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:20.674 [2024-07-15 11:54:34.083342] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfdc150 name Existed_Raid, state offline 00:13:20.674 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:20.674 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:20.674 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.674 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 
1463137 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1463137 ']' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1463137 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1463137 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1463137' 00:13:20.934 killing process with pid 1463137 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1463137 00:13:20.934 [2024-07-15 11:54:34.416854] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:20.934 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1463137 00:13:20.934 [2024-07-15 11:54:34.417712] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:21.193 11:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:13:21.193 00:13:21.193 real 0m10.829s 00:13:21.193 user 0m19.235s 00:13:21.193 sys 0m2.036s 00:13:21.193 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:21.193 11:54:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:21.193 ************************************ 00:13:21.193 END TEST raid_state_function_test_sb 00:13:21.193 
************************************
00:13:21.193 11:54:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:13:21.193 11:54:34 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2
00:13:21.193 11:54:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']'
00:13:21.193 11:54:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:13:21.193 11:54:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:13:21.193 ************************************
00:13:21.193 START TEST raid_superblock_test
00:13:21.193 ************************************
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 2
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=()
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=()
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=()
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']'
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64'
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1464786
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1464786 /var/tmp/spdk-raid.sock
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1464786 ']'
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:13:21.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:13:21.193 11:54:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:13:21.193 [2024-07-15 11:54:34.769163] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:13:21.193 [2024-07-15 11:54:34.769227] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1464786 ]
00:13:21.453 [2024-07-15 11:54:34.895914] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:13:21.453 [2024-07-15 11:54:35.004652] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:13:21.712 [2024-07-15 11:54:35.073098] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:21.712 [2024-07-15 11:54:35.073134] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 ))
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs ))
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc)
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt)
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid)
00:13:21.712 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1
00:13:21.971 malloc1
00:13:21.971 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:13:22.230 [2024-07-15 11:54:35.802720] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:13:22.230 [2024-07-15 11:54:35.802769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:22.230 [2024-07-15 11:54:35.802790] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x118c560
00:13:22.230 [2024-07-15 11:54:35.802803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:22.230 [2024-07-15 11:54:35.804417] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:22.230 [2024-07-15 11:54:35.804447] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:13:22.230 pt1
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ ))
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs ))
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc)
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt)
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid)
00:13:22.230 11:54:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2
00:13:22.799 malloc2
00:13:22.799 11:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:13:23.059 [2024-07-15 11:54:36.613483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:13:23.059 [2024-07-15 11:54:36.613532] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:23.059 [2024-07-15 11:54:36.613550] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122a5b0
00:13:23.059 [2024-07-15 11:54:36.613563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:23.059 [2024-07-15 11:54:36.615113] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:23.059 [2024-07-15 11:54:36.615142] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:13:23.059 pt2
00:13:23.059 11:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ ))
00:13:23.059 11:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs ))
00:13:23.059 11:54:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s
00:13:23.627 [2024-07-15 11:54:37.114809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:13:23.627 [2024-07-15 11:54:37.116167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:13:23.627 [2024-07-15 11:54:37.116313] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x122bdb0
00:13:23.627 [2024-07-15 11:54:37.116325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:13:23.627 [2024-07-15 11:54:37.116522] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x122d280
00:13:23.627 [2024-07-15 11:54:37.116658] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x122bdb0
00:13:23.627 [2024-07-15 11:54:37.116669] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x122bdb0
00:13:23.627 [2024-07-15 11:54:37.116783] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:23.627 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:23.628 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:23.628 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:23.628 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:23.628 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:23.628 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:24.195 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:24.195 "name": "raid_bdev1",
00:13:24.195 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd",
00:13:24.195 "strip_size_kb": 64,
00:13:24.195 "state": "online",
00:13:24.195 "raid_level": "concat",
00:13:24.195 "superblock": true,
00:13:24.195 "num_base_bdevs": 2,
00:13:24.195 "num_base_bdevs_discovered": 2,
00:13:24.195 "num_base_bdevs_operational": 2,
00:13:24.195 "base_bdevs_list": [
00:13:24.195 {
00:13:24.195 "name": "pt1",
00:13:24.195 "uuid": "00000000-0000-0000-0000-000000000001",
00:13:24.195 "is_configured": true,
00:13:24.195 "data_offset": 2048,
00:13:24.195 "data_size": 63488
00:13:24.195 },
00:13:24.195 {
00:13:24.195 "name": "pt2",
00:13:24.195 "uuid": "00000000-0000-0000-0000-000000000002",
00:13:24.195 "is_configured": true,
00:13:24.195 "data_offset": 2048,
00:13:24.195 "data_size": 63488
00:13:24.195 }
00:13:24.195 ]
00:13:24.195 }'
00:13:24.195 11:54:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:24.195 11:54:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:13:25.133 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:13:25.393 [2024-07-15 11:54:38.743346] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:13:25.393 "name": "raid_bdev1",
00:13:25.393 "aliases": [
00:13:25.393 "c07a35b7-86e9-4c54-9678-c1025e8220cd"
00:13:25.393 ],
00:13:25.393 "product_name": "Raid Volume",
00:13:25.393 "block_size": 512,
00:13:25.393 "num_blocks": 126976,
00:13:25.393 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd",
00:13:25.393 "assigned_rate_limits": {
00:13:25.393 "rw_ios_per_sec": 0,
00:13:25.393 "rw_mbytes_per_sec": 0,
00:13:25.393 "r_mbytes_per_sec": 0,
00:13:25.393 "w_mbytes_per_sec": 0
00:13:25.393 },
00:13:25.393 "claimed": false,
00:13:25.393 "zoned": false,
00:13:25.393 "supported_io_types": {
00:13:25.393 "read": true,
00:13:25.393 "write": true,
00:13:25.393 "unmap": true,
00:13:25.393 "flush": true,
00:13:25.393 "reset": true,
00:13:25.393 "nvme_admin": false,
00:13:25.393 "nvme_io": false,
00:13:25.393 "nvme_io_md": false,
00:13:25.393 "write_zeroes": true,
00:13:25.393 "zcopy": false,
00:13:25.393 "get_zone_info": false,
00:13:25.393 "zone_management": false,
00:13:25.393 "zone_append": false,
00:13:25.393 "compare": false,
00:13:25.393 "compare_and_write": false,
00:13:25.393 "abort": false,
00:13:25.393 "seek_hole": false,
00:13:25.393 "seek_data": false,
00:13:25.393 "copy": false,
00:13:25.393 "nvme_iov_md": false
00:13:25.393 },
00:13:25.393 "memory_domains": [
00:13:25.393 {
00:13:25.393 "dma_device_id": "system",
00:13:25.393 "dma_device_type": 1
00:13:25.393 },
00:13:25.393 {
00:13:25.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:25.393 "dma_device_type": 2
00:13:25.393 },
00:13:25.393 {
00:13:25.393 "dma_device_id": "system",
00:13:25.393 "dma_device_type": 1
00:13:25.393 },
00:13:25.393 {
00:13:25.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:25.393 "dma_device_type": 2
00:13:25.393 }
00:13:25.393 ],
00:13:25.393 "driver_specific": {
00:13:25.393 "raid": {
00:13:25.393 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd",
00:13:25.393 "strip_size_kb": 64,
00:13:25.393 "state": "online",
00:13:25.393 "raid_level": "concat",
00:13:25.393 "superblock": true,
00:13:25.393 "num_base_bdevs": 2,
00:13:25.393 "num_base_bdevs_discovered": 2,
00:13:25.393 "num_base_bdevs_operational": 2,
00:13:25.393 "base_bdevs_list": [
00:13:25.393 {
00:13:25.393 "name": "pt1",
00:13:25.393 "uuid": "00000000-0000-0000-0000-000000000001",
00:13:25.393 "is_configured": true,
00:13:25.393 "data_offset": 2048,
00:13:25.393 "data_size": 63488
00:13:25.393 },
00:13:25.393 {
00:13:25.393 "name": "pt2",
00:13:25.393 "uuid": "00000000-0000-0000-0000-000000000002",
00:13:25.393 "is_configured": true,
00:13:25.393 "data_offset": 2048,
00:13:25.393 "data_size": 63488
00:13:25.393 }
00:13:25.393 ]
00:13:25.393 }
00:13:25.393 }
00:13:25.393 }'
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:13:25.393 pt2'
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:13:25.393 11:54:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:13:25.652 "name": "pt1",
00:13:25.652 "aliases": [
00:13:25.652 "00000000-0000-0000-0000-000000000001"
00:13:25.652 ],
00:13:25.652 "product_name": "passthru",
00:13:25.652 "block_size": 512,
00:13:25.652 "num_blocks": 65536,
00:13:25.652 "uuid": "00000000-0000-0000-0000-000000000001",
00:13:25.652 "assigned_rate_limits": {
00:13:25.652 "rw_ios_per_sec": 0,
00:13:25.652 "rw_mbytes_per_sec": 0,
00:13:25.652 "r_mbytes_per_sec": 0,
00:13:25.652 "w_mbytes_per_sec": 0
00:13:25.652 },
00:13:25.652 "claimed": true,
00:13:25.652 "claim_type": "exclusive_write",
00:13:25.652 "zoned": false,
00:13:25.652 "supported_io_types": {
00:13:25.652 "read": true,
00:13:25.652 "write": true,
00:13:25.652 "unmap": true,
00:13:25.652 "flush": true,
00:13:25.652 "reset": true,
00:13:25.652 "nvme_admin": false,
00:13:25.652 "nvme_io": false,
00:13:25.652 "nvme_io_md": false,
00:13:25.652 "write_zeroes": true,
00:13:25.652 "zcopy": true,
00:13:25.652 "get_zone_info": false,
00:13:25.652 "zone_management": false,
00:13:25.652 "zone_append": false,
00:13:25.652 "compare": false,
00:13:25.652 "compare_and_write": false,
00:13:25.652 "abort": true,
00:13:25.652 "seek_hole": false,
00:13:25.652 "seek_data": false,
00:13:25.652 "copy": true,
00:13:25.652 "nvme_iov_md": false
00:13:25.652 },
00:13:25.652 "memory_domains": [
00:13:25.652 {
00:13:25.652 "dma_device_id": "system",
00:13:25.652 "dma_device_type": 1
00:13:25.652 },
00:13:25.652 {
00:13:25.652 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:25.652 "dma_device_type": 2
00:13:25.652 }
00:13:25.652 ],
00:13:25.652 "driver_specific": {
00:13:25.652 "passthru": {
00:13:25.652 "name": "pt1",
00:13:25.652 "base_bdev_name": "malloc1"
00:13:25.652 }
00:13:25.652 }
00:13:25.652 }'
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:13:25.652 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:13:25.912 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:13:26.171 "name": "pt2",
00:13:26.171 "aliases": [
00:13:26.171 "00000000-0000-0000-0000-000000000002"
00:13:26.171 ],
00:13:26.171 "product_name": "passthru",
00:13:26.171 "block_size": 512,
00:13:26.171 "num_blocks": 65536,
00:13:26.171 "uuid": "00000000-0000-0000-0000-000000000002",
00:13:26.171 "assigned_rate_limits": {
00:13:26.171 "rw_ios_per_sec": 0,
00:13:26.171 "rw_mbytes_per_sec": 0,
00:13:26.171 "r_mbytes_per_sec": 0,
00:13:26.171 "w_mbytes_per_sec": 0
00:13:26.171 },
00:13:26.171 "claimed": true,
00:13:26.171 "claim_type": "exclusive_write",
00:13:26.171 "zoned": false,
00:13:26.171 "supported_io_types": {
00:13:26.171 "read": true,
00:13:26.171 "write": true,
00:13:26.171 "unmap": true,
00:13:26.171 "flush": true,
00:13:26.171 "reset": true,
00:13:26.171 "nvme_admin": false,
00:13:26.171 "nvme_io": false,
00:13:26.171 "nvme_io_md": false,
00:13:26.171 "write_zeroes": true,
00:13:26.171 "zcopy": true,
00:13:26.171 "get_zone_info": false,
00:13:26.171 "zone_management": false,
00:13:26.171 "zone_append": false,
00:13:26.171 "compare": false,
00:13:26.171 "compare_and_write": false,
00:13:26.171 "abort": true,
00:13:26.171 "seek_hole": false,
00:13:26.171 "seek_data": false,
00:13:26.171 "copy": true,
00:13:26.171 "nvme_iov_md": false
00:13:26.171 },
00:13:26.171 "memory_domains": [
00:13:26.171 {
00:13:26.171 "dma_device_id": "system",
00:13:26.171 "dma_device_type": 1
00:13:26.171 },
00:13:26.171 {
00:13:26.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:26.171 "dma_device_type": 2
00:13:26.171 }
00:13:26.171 ],
00:13:26.171 "driver_specific": {
00:13:26.171 "passthru": {
00:13:26.171 "name": "pt2",
00:13:26.171 "base_bdev_name": "malloc2"
00:13:26.171 }
00:13:26.171 }
00:13:26.171 }'
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:13:26.171 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:13:26.431 11:54:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid'
00:13:26.691 [2024-07-15 11:54:40.191195] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:13:26.691 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c07a35b7-86e9-4c54-9678-c1025e8220cd
00:13:26.691 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c07a35b7-86e9-4c54-9678-c1025e8220cd ']'
00:13:26.691 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:13:26.951 [2024-07-15 11:54:40.443590] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:13:26.951 [2024-07-15 11:54:40.443613] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:13:26.951 [2024-07-15 11:54:40.443669] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:13:26.951 [2024-07-15 11:54:40.443721] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:13:26.951 [2024-07-15 11:54:40.443732] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x122bdb0 name raid_bdev1, state offline
00:13:26.951 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:26.951 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]'
00:13:27.209 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev=
00:13:27.209 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']'
00:13:27.209 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:13:27.209 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:13:27.469 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:13:27.469 11:54:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:13:27.729 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs
00:13:27.729 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any'
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']'
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:13:27.988 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1
00:13:28.247 [2024-07-15 11:54:41.658767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:13:28.247 [2024-07-15 11:54:41.660101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:13:28.247 [2024-07-15 11:54:41.660155] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:13:28.247 [2024-07-15 11:54:41.660195] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:13:28.247 [2024-07-15 11:54:41.660214] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:13:28.247 [2024-07-15 11:54:41.660224] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x118fab0 name raid_bdev1, state configuring
00:13:28.247 request:
00:13:28.247 {
00:13:28.247 "name": "raid_bdev1",
00:13:28.247 "raid_level": "concat",
00:13:28.247 "base_bdevs": [
00:13:28.247 "malloc1",
00:13:28.247 "malloc2"
00:13:28.247 ],
00:13:28.247 "strip_size_kb": 64,
00:13:28.247 "superblock": false,
00:13:28.247 "method": "bdev_raid_create",
00:13:28.247 "req_id": 1
00:13:28.247 }
00:13:28.247 Got JSON-RPC error response
00:13:28.247 response:
00:13:28.247 {
00:13:28.247 "code": -17,
00:13:28.247 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:13:28.247 }
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 ))
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]]
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 ))
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:28.247 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]'
00:13:28.507 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev=
00:13:28.507 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']'
00:13:28.507 11:54:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:13:28.767 [2024-07-15 11:54:42.151999] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:13:28.767 [2024-07-15 11:54:42.152043] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:28.767 [2024-07-15 11:54:42.152060] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122cae0
00:13:28.767 [2024-07-15 11:54:42.152072] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:28.767 [2024-07-15 11:54:42.153695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:28.767 [2024-07-15 11:54:42.153722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:13:28.767 [2024-07-15 11:54:42.153785] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:13:28.767 [2024-07-15 11:54:42.153809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:13:28.767 pt1
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:28.767 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:29.027 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:29.027 "name": "raid_bdev1",
00:13:29.027 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd",
00:13:29.027 "strip_size_kb": 64,
00:13:29.027 "state": "configuring",
00:13:29.027 "raid_level": "concat",
00:13:29.027 "superblock": true,
00:13:29.027 "num_base_bdevs": 2,
00:13:29.027 "num_base_bdevs_discovered": 1,
00:13:29.027 "num_base_bdevs_operational": 2,
00:13:29.027 "base_bdevs_list": [
00:13:29.027 {
00:13:29.027 "name": "pt1",
00:13:29.027 "uuid": "00000000-0000-0000-0000-000000000001",
00:13:29.027 "is_configured": true,
00:13:29.027 "data_offset": 2048,
00:13:29.027 "data_size": 63488
00:13:29.027 },
00:13:29.027 {
00:13:29.027 "name": null,
00:13:29.027 "uuid": "00000000-0000-0000-0000-000000000002",
00:13:29.027 "is_configured": false,
00:13:29.027 "data_offset": 2048,
00:13:29.027 "data_size": 63488
00:13:29.027 }
00:13:29.027 ]
00:13:29.027 }'
00:13:29.027 11:54:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:29.027 11:54:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:13:29.595 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']'
00:13:29.595 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 ))
00:13:29.595 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs ))
00:13:29.595 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:13:29.866 [2024-07-15 11:54:43.238899] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:13:29.866 [2024-07-15 11:54:43.238947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:29.866 [2024-07-15 11:54:43.238966] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x122d9d0
00:13:29.866 [2024-07-15 11:54:43.238978] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:29.866 [2024-07-15 11:54:43.239305] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:29.866 [2024-07-15 11:54:43.239322] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:13:29.866 [2024-07-15 11:54:43.239381] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:13:29.866 [2024-07-15 11:54:43.239398] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:13:29.866 [2024-07-15 11:54:43.239491] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x118e2c0
00:13:29.866 [2024-07-15 11:54:43.239502] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:13:29.866 [2024-07-15 11:54:43.239664] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x118b310
00:13:29.866 [2024-07-15 11:54:43.239793] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x118e2c0
00:13:29.866 [2024-07-15 11:54:43.239804] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x118e2c0
00:13:29.866 [2024-07-15 11:54:43.239907] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:29.866 pt2
00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ ))
00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.866 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:30.128 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:30.128 "name": "raid_bdev1", 00:13:30.128 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd", 00:13:30.128 "strip_size_kb": 64, 00:13:30.128 "state": "online", 00:13:30.128 "raid_level": "concat", 00:13:30.128 "superblock": true, 00:13:30.128 "num_base_bdevs": 2, 00:13:30.128 "num_base_bdevs_discovered": 2, 00:13:30.128 "num_base_bdevs_operational": 2, 
00:13:30.128 "base_bdevs_list": [ 00:13:30.128 { 00:13:30.128 "name": "pt1", 00:13:30.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:30.128 "is_configured": true, 00:13:30.128 "data_offset": 2048, 00:13:30.128 "data_size": 63488 00:13:30.128 }, 00:13:30.128 { 00:13:30.128 "name": "pt2", 00:13:30.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:30.128 "is_configured": true, 00:13:30.128 "data_offset": 2048, 00:13:30.128 "data_size": 63488 00:13:30.128 } 00:13:30.128 ] 00:13:30.128 }' 00:13:30.128 11:54:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:30.128 11:54:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:30.696 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:30.956 [2024-07-15 11:54:44.317992] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:30.956 "name": "raid_bdev1", 00:13:30.956 "aliases": [ 00:13:30.956 "c07a35b7-86e9-4c54-9678-c1025e8220cd" 00:13:30.956 ], 
00:13:30.956 "product_name": "Raid Volume", 00:13:30.956 "block_size": 512, 00:13:30.956 "num_blocks": 126976, 00:13:30.956 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd", 00:13:30.956 "assigned_rate_limits": { 00:13:30.956 "rw_ios_per_sec": 0, 00:13:30.956 "rw_mbytes_per_sec": 0, 00:13:30.956 "r_mbytes_per_sec": 0, 00:13:30.956 "w_mbytes_per_sec": 0 00:13:30.956 }, 00:13:30.956 "claimed": false, 00:13:30.956 "zoned": false, 00:13:30.956 "supported_io_types": { 00:13:30.956 "read": true, 00:13:30.956 "write": true, 00:13:30.956 "unmap": true, 00:13:30.956 "flush": true, 00:13:30.956 "reset": true, 00:13:30.956 "nvme_admin": false, 00:13:30.956 "nvme_io": false, 00:13:30.956 "nvme_io_md": false, 00:13:30.956 "write_zeroes": true, 00:13:30.956 "zcopy": false, 00:13:30.956 "get_zone_info": false, 00:13:30.956 "zone_management": false, 00:13:30.956 "zone_append": false, 00:13:30.956 "compare": false, 00:13:30.956 "compare_and_write": false, 00:13:30.956 "abort": false, 00:13:30.956 "seek_hole": false, 00:13:30.956 "seek_data": false, 00:13:30.956 "copy": false, 00:13:30.956 "nvme_iov_md": false 00:13:30.956 }, 00:13:30.956 "memory_domains": [ 00:13:30.956 { 00:13:30.956 "dma_device_id": "system", 00:13:30.956 "dma_device_type": 1 00:13:30.956 }, 00:13:30.956 { 00:13:30.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.956 "dma_device_type": 2 00:13:30.956 }, 00:13:30.956 { 00:13:30.956 "dma_device_id": "system", 00:13:30.956 "dma_device_type": 1 00:13:30.956 }, 00:13:30.956 { 00:13:30.956 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.956 "dma_device_type": 2 00:13:30.956 } 00:13:30.956 ], 00:13:30.956 "driver_specific": { 00:13:30.956 "raid": { 00:13:30.956 "uuid": "c07a35b7-86e9-4c54-9678-c1025e8220cd", 00:13:30.956 "strip_size_kb": 64, 00:13:30.956 "state": "online", 00:13:30.956 "raid_level": "concat", 00:13:30.956 "superblock": true, 00:13:30.956 "num_base_bdevs": 2, 00:13:30.956 "num_base_bdevs_discovered": 2, 00:13:30.956 "num_base_bdevs_operational": 
2, 00:13:30.956 "base_bdevs_list": [ 00:13:30.956 { 00:13:30.956 "name": "pt1", 00:13:30.956 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:30.956 "is_configured": true, 00:13:30.956 "data_offset": 2048, 00:13:30.956 "data_size": 63488 00:13:30.956 }, 00:13:30.956 { 00:13:30.956 "name": "pt2", 00:13:30.956 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:30.956 "is_configured": true, 00:13:30.956 "data_offset": 2048, 00:13:30.956 "data_size": 63488 00:13:30.956 } 00:13:30.956 ] 00:13:30.956 } 00:13:30.956 } 00:13:30.956 }' 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:30.956 pt2' 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:30.956 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.215 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.215 "name": "pt1", 00:13:31.215 "aliases": [ 00:13:31.215 "00000000-0000-0000-0000-000000000001" 00:13:31.215 ], 00:13:31.215 "product_name": "passthru", 00:13:31.215 "block_size": 512, 00:13:31.215 "num_blocks": 65536, 00:13:31.215 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:31.215 "assigned_rate_limits": { 00:13:31.215 "rw_ios_per_sec": 0, 00:13:31.215 "rw_mbytes_per_sec": 0, 00:13:31.215 "r_mbytes_per_sec": 0, 00:13:31.215 "w_mbytes_per_sec": 0 00:13:31.215 }, 00:13:31.215 "claimed": true, 00:13:31.216 "claim_type": "exclusive_write", 00:13:31.216 "zoned": false, 00:13:31.216 "supported_io_types": { 00:13:31.216 "read": true, 
00:13:31.216 "write": true, 00:13:31.216 "unmap": true, 00:13:31.216 "flush": true, 00:13:31.216 "reset": true, 00:13:31.216 "nvme_admin": false, 00:13:31.216 "nvme_io": false, 00:13:31.216 "nvme_io_md": false, 00:13:31.216 "write_zeroes": true, 00:13:31.216 "zcopy": true, 00:13:31.216 "get_zone_info": false, 00:13:31.216 "zone_management": false, 00:13:31.216 "zone_append": false, 00:13:31.216 "compare": false, 00:13:31.216 "compare_and_write": false, 00:13:31.216 "abort": true, 00:13:31.216 "seek_hole": false, 00:13:31.216 "seek_data": false, 00:13:31.216 "copy": true, 00:13:31.216 "nvme_iov_md": false 00:13:31.216 }, 00:13:31.216 "memory_domains": [ 00:13:31.216 { 00:13:31.216 "dma_device_id": "system", 00:13:31.216 "dma_device_type": 1 00:13:31.216 }, 00:13:31.216 { 00:13:31.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.216 "dma_device_type": 2 00:13:31.216 } 00:13:31.216 ], 00:13:31.216 "driver_specific": { 00:13:31.216 "passthru": { 00:13:31.216 "name": "pt1", 00:13:31.216 "base_bdev_name": "malloc1" 00:13:31.216 } 00:13:31.216 } 00:13:31.216 }' 00:13:31.216 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.216 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.216 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.216 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.216 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.478 11:54:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:31.478 11:54:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.738 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.738 "name": "pt2", 00:13:31.738 "aliases": [ 00:13:31.738 "00000000-0000-0000-0000-000000000002" 00:13:31.738 ], 00:13:31.738 "product_name": "passthru", 00:13:31.738 "block_size": 512, 00:13:31.738 "num_blocks": 65536, 00:13:31.738 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:31.738 "assigned_rate_limits": { 00:13:31.738 "rw_ios_per_sec": 0, 00:13:31.738 "rw_mbytes_per_sec": 0, 00:13:31.738 "r_mbytes_per_sec": 0, 00:13:31.738 "w_mbytes_per_sec": 0 00:13:31.738 }, 00:13:31.738 "claimed": true, 00:13:31.738 "claim_type": "exclusive_write", 00:13:31.738 "zoned": false, 00:13:31.738 "supported_io_types": { 00:13:31.738 "read": true, 00:13:31.738 "write": true, 00:13:31.738 "unmap": true, 00:13:31.738 "flush": true, 00:13:31.738 "reset": true, 00:13:31.738 "nvme_admin": false, 00:13:31.738 "nvme_io": false, 00:13:31.738 "nvme_io_md": false, 00:13:31.738 "write_zeroes": true, 00:13:31.738 "zcopy": true, 00:13:31.738 "get_zone_info": false, 00:13:31.738 "zone_management": false, 00:13:31.738 "zone_append": false, 00:13:31.738 "compare": false, 00:13:31.738 "compare_and_write": false, 00:13:31.738 "abort": true, 00:13:31.738 "seek_hole": false, 00:13:31.738 "seek_data": false, 00:13:31.738 "copy": 
true, 00:13:31.738 "nvme_iov_md": false 00:13:31.738 }, 00:13:31.738 "memory_domains": [ 00:13:31.738 { 00:13:31.738 "dma_device_id": "system", 00:13:31.738 "dma_device_type": 1 00:13:31.738 }, 00:13:31.738 { 00:13:31.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.738 "dma_device_type": 2 00:13:31.738 } 00:13:31.738 ], 00:13:31.738 "driver_specific": { 00:13:31.738 "passthru": { 00:13:31.738 "name": "pt2", 00:13:31.738 "base_bdev_name": "malloc2" 00:13:31.738 } 00:13:31.738 } 00:13:31.738 }' 00:13:31.738 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.738 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.997 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.254 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.254 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.255 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:32.255 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 
00:13:32.511 [2024-07-15 11:54:45.918219] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c07a35b7-86e9-4c54-9678-c1025e8220cd '!=' c07a35b7-86e9-4c54-9678-c1025e8220cd ']' 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1464786 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1464786 ']' 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1464786 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1464786 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1464786' 00:13:32.511 killing process with pid 1464786 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1464786 00:13:32.511 [2024-07-15 11:54:45.990729] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:32.511 [2024-07-15 11:54:45.990785] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:32.511 [2024-07-15 
11:54:45.990829] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:32.511 [2024-07-15 11:54:45.990847] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x118e2c0 name raid_bdev1, state offline 00:13:32.511 11:54:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1464786 00:13:32.511 [2024-07-15 11:54:46.008482] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:32.777 11:54:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:32.777 00:13:32.777 real 0m11.509s 00:13:32.777 user 0m21.108s 00:13:32.777 sys 0m2.128s 00:13:32.777 11:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:32.777 11:54:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.778 ************************************ 00:13:32.778 END TEST raid_superblock_test 00:13:32.778 ************************************ 00:13:32.778 11:54:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:32.778 11:54:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:13:32.778 11:54:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:32.778 11:54:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:32.778 11:54:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:32.778 ************************************ 00:13:32.778 START TEST raid_read_error_test 00:13:32.778 ************************************ 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 read 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:32.778 11:54:46 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ZEbtq9zpau 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1466428 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1466428 /var/tmp/spdk-raid.sock 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1466428 ']' 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:32.778 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:32.778 11:54:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:33.044 [2024-07-15 11:54:46.373521] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:13:33.044 [2024-07-15 11:54:46.373592] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1466428 ] 00:13:33.044 [2024-07-15 11:54:46.503979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:33.044 [2024-07-15 11:54:46.607770] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:33.301 [2024-07-15 11:54:46.684492] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.301 [2024-07-15 11:54:46.684530] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:33.879 11:54:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:33.879 11:54:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:33.879 11:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:33.879 11:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:34.137 BaseBdev1_malloc 00:13:34.137 11:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:34.396 true 00:13:34.396 11:54:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:34.655 [2024-07-15 11:54:48.021033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:34.655 [2024-07-15 11:54:48.021078] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:34.655 [2024-07-15 11:54:48.021098] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16474e0 00:13:34.655 [2024-07-15 11:54:48.021111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:34.655 [2024-07-15 11:54:48.022879] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:34.655 [2024-07-15 11:54:48.022908] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:34.655 BaseBdev1 00:13:34.655 11:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:34.655 11:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:34.914 BaseBdev2_malloc 00:13:34.914 11:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:35.173 true 00:13:35.173 11:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:35.173 [2024-07-15 11:54:48.757004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:35.173 [2024-07-15 11:54:48.757049] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:35.173 [2024-07-15 11:54:48.757070] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x164c7b0 00:13:35.173 [2024-07-15 11:54:48.757089] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:35.173 [2024-07-15 11:54:48.758681] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:35.173 [2024-07-15 11:54:48.758714] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:35.173 BaseBdev2 00:13:35.457 11:54:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:35.457 [2024-07-15 11:54:48.997666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:35.457 [2024-07-15 11:54:48.999032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:35.457 [2024-07-15 11:54:48.999213] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x164de10 00:13:35.457 [2024-07-15 11:54:48.999226] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:35.457 [2024-07-15 11:54:48.999424] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a23f0 00:13:35.457 [2024-07-15 11:54:48.999570] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x164de10 00:13:35.457 [2024-07-15 11:54:48.999580] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x164de10 00:13:35.457 [2024-07-15 11:54:48.999683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.457 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:35.716 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.716 "name": "raid_bdev1", 00:13:35.716 "uuid": "e04b92f2-2202-409a-b140-ff51af934487", 00:13:35.716 "strip_size_kb": 64, 00:13:35.716 "state": "online", 00:13:35.716 "raid_level": "concat", 00:13:35.716 "superblock": true, 00:13:35.716 "num_base_bdevs": 2, 00:13:35.716 "num_base_bdevs_discovered": 2, 00:13:35.716 "num_base_bdevs_operational": 2, 00:13:35.716 "base_bdevs_list": [ 00:13:35.716 { 00:13:35.716 "name": "BaseBdev1", 00:13:35.716 "uuid": "a0a5010c-0346-5ad6-8295-f3c0062ec000", 00:13:35.716 "is_configured": true, 00:13:35.716 "data_offset": 2048, 00:13:35.716 "data_size": 63488 00:13:35.716 }, 00:13:35.716 { 00:13:35.716 "name": "BaseBdev2", 00:13:35.716 "uuid": "c29a476f-10d7-5d3f-bc9b-cd77bc72f9a8", 00:13:35.716 "is_configured": true, 00:13:35.716 "data_offset": 2048, 00:13:35.716 "data_size": 63488 00:13:35.716 } 00:13:35.716 ] 00:13:35.716 }' 00:13:35.716 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.716 11:54:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.284 11:54:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:36.284 11:54:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:36.543 [2024-07-15 11:54:49.948487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1648ef0 00:13:37.478 11:54:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:37.478 11:54:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.478 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:38.045 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.045 "name": "raid_bdev1", 00:13:38.045 "uuid": "e04b92f2-2202-409a-b140-ff51af934487", 00:13:38.045 "strip_size_kb": 64, 00:13:38.045 "state": "online", 00:13:38.045 "raid_level": "concat", 00:13:38.045 "superblock": true, 00:13:38.045 "num_base_bdevs": 2, 00:13:38.045 "num_base_bdevs_discovered": 2, 00:13:38.045 "num_base_bdevs_operational": 2, 00:13:38.045 "base_bdevs_list": [ 00:13:38.045 { 00:13:38.045 "name": "BaseBdev1", 00:13:38.045 "uuid": "a0a5010c-0346-5ad6-8295-f3c0062ec000", 00:13:38.046 "is_configured": true, 00:13:38.046 "data_offset": 2048, 00:13:38.046 "data_size": 63488 00:13:38.046 }, 00:13:38.046 { 00:13:38.046 "name": "BaseBdev2", 00:13:38.046 "uuid": "c29a476f-10d7-5d3f-bc9b-cd77bc72f9a8", 00:13:38.046 "is_configured": true, 00:13:38.046 "data_offset": 2048, 00:13:38.046 "data_size": 63488 00:13:38.046 } 00:13:38.046 ] 00:13:38.046 }' 00:13:38.046 11:54:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.046 11:54:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.626 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:38.885 [2024-07-15 11:54:52.401494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:38.885 [2024-07-15 11:54:52.401529] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:13:38.885 [2024-07-15 11:54:52.404724] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:38.885 [2024-07-15 11:54:52.404754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:38.885 [2024-07-15 11:54:52.404782] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:38.885 [2024-07-15 11:54:52.404793] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x164de10 name raid_bdev1, state offline 00:13:38.885 0 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1466428 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1466428 ']' 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1466428 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1466428 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1466428' 00:13:38.885 killing process with pid 1466428 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1466428 00:13:38.885 [2024-07-15 11:54:52.470192] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:38.885 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1466428 00:13:39.145 [2024-07-15 11:54:52.482352] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ZEbtq9zpau 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.41 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.41 != \0\.\0\0 ]] 00:13:39.145 00:13:39.145 real 0m6.431s 00:13:39.145 user 0m10.097s 00:13:39.145 sys 0m1.114s 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:39.145 11:54:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.145 ************************************ 00:13:39.145 END TEST raid_read_error_test 00:13:39.145 ************************************ 00:13:39.404 11:54:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:39.404 11:54:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:13:39.404 11:54:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:39.404 11:54:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:39.404 11:54:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:39.404 ************************************ 00:13:39.404 START TEST raid_write_error_test 00:13:39.404 ************************************ 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1123 -- # raid_io_error_test concat 2 write 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:39.404 11:54:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Fu23AYOz47 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1467402 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1467402 /var/tmp/spdk-raid.sock 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1467402 ']' 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:39.404 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:39.404 11:54:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.404 [2024-07-15 11:54:52.893845] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:13:39.404 [2024-07-15 11:54:52.893915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1467402 ] 00:13:39.663 [2024-07-15 11:54:53.026049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:39.663 [2024-07-15 11:54:53.127969] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:39.663 [2024-07-15 11:54:53.186402] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:39.663 [2024-07-15 11:54:53.186444] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:40.598 11:54:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:40.598 11:54:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:13:40.598 11:54:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:40.598 11:54:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:40.598 BaseBdev1_malloc 00:13:40.598 11:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:40.856 true 00:13:40.857 11:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:41.115 [2024-07-15 11:54:54.562509] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:41.115 [2024-07-15 11:54:54.562554] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:13:41.115 [2024-07-15 11:54:54.562573] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23284e0 00:13:41.115 [2024-07-15 11:54:54.562585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:41.115 [2024-07-15 11:54:54.564215] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:41.115 [2024-07-15 11:54:54.564244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:41.115 BaseBdev1 00:13:41.115 11:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:41.115 11:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:41.374 BaseBdev2_malloc 00:13:41.374 11:54:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:41.633 true 00:13:41.633 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:41.892 [2024-07-15 11:54:55.309089] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:41.892 [2024-07-15 11:54:55.309137] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:41.892 [2024-07-15 11:54:55.309156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232d7b0 00:13:41.892 [2024-07-15 11:54:55.309169] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:41.892 [2024-07-15 11:54:55.310611] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:41.892 [2024-07-15 11:54:55.310638] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:41.892 BaseBdev2 00:13:41.892 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:42.150 [2024-07-15 11:54:55.557775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:42.150 [2024-07-15 11:54:55.558954] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:42.150 [2024-07-15 11:54:55.559136] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232ee10 00:13:42.150 [2024-07-15 11:54:55.559149] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:13:42.150 [2024-07-15 11:54:55.559332] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21833f0 00:13:42.150 [2024-07-15 11:54:55.559475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232ee10 00:13:42.150 [2024-07-15 11:54:55.559485] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x232ee10 00:13:42.150 [2024-07-15 11:54:55.559580] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:42.151 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.410 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.410 "name": "raid_bdev1", 00:13:42.410 "uuid": "d5e8369a-5206-4f7b-b08a-e165216ddc70", 00:13:42.410 "strip_size_kb": 64, 00:13:42.410 "state": "online", 00:13:42.410 "raid_level": "concat", 00:13:42.410 "superblock": true, 00:13:42.410 "num_base_bdevs": 2, 00:13:42.410 "num_base_bdevs_discovered": 2, 00:13:42.410 "num_base_bdevs_operational": 2, 00:13:42.410 "base_bdevs_list": [ 00:13:42.410 { 00:13:42.410 "name": "BaseBdev1", 00:13:42.410 "uuid": "2af6f0b0-1e49-532f-acc8-b3e218a0fa30", 00:13:42.410 "is_configured": true, 00:13:42.410 "data_offset": 2048, 00:13:42.410 "data_size": 63488 00:13:42.410 }, 00:13:42.410 { 00:13:42.410 "name": "BaseBdev2", 00:13:42.410 "uuid": "450a7551-750b-56c8-a3ff-876d2f8413b1", 00:13:42.410 "is_configured": true, 00:13:42.410 "data_offset": 2048, 00:13:42.410 "data_size": 63488 00:13:42.410 } 00:13:42.410 ] 00:13:42.410 }' 00:13:42.410 11:54:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.410 11:54:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:42.978 
11:54:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:42.978 11:54:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:42.978 [2024-07-15 11:54:56.548682] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2329ef0 00:13:43.915 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.174 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:44.433 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.433 "name": "raid_bdev1", 00:13:44.433 "uuid": "d5e8369a-5206-4f7b-b08a-e165216ddc70", 00:13:44.433 "strip_size_kb": 64, 00:13:44.433 "state": "online", 00:13:44.433 "raid_level": "concat", 00:13:44.433 "superblock": true, 00:13:44.433 "num_base_bdevs": 2, 00:13:44.433 "num_base_bdevs_discovered": 2, 00:13:44.433 "num_base_bdevs_operational": 2, 00:13:44.433 "base_bdevs_list": [ 00:13:44.433 { 00:13:44.433 "name": "BaseBdev1", 00:13:44.433 "uuid": "2af6f0b0-1e49-532f-acc8-b3e218a0fa30", 00:13:44.433 "is_configured": true, 00:13:44.433 "data_offset": 2048, 00:13:44.433 "data_size": 63488 00:13:44.433 }, 00:13:44.433 { 00:13:44.433 "name": "BaseBdev2", 00:13:44.433 "uuid": "450a7551-750b-56c8-a3ff-876d2f8413b1", 00:13:44.433 "is_configured": true, 00:13:44.433 "data_offset": 2048, 00:13:44.433 "data_size": 63488 00:13:44.433 } 00:13:44.433 ] 00:13:44.433 }' 00:13:44.433 11:54:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.433 11:54:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:45.367 [2024-07-15 11:54:58.838665] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:45.367 [2024-07-15 11:54:58.838717] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:13:45.367 [2024-07-15 11:54:58.841911] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:45.367 [2024-07-15 11:54:58.841943] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:45.367 [2024-07-15 11:54:58.841972] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:45.367 [2024-07-15 11:54:58.841983] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232ee10 name raid_bdev1, state offline 00:13:45.367 0 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1467402 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1467402 ']' 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1467402 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1467402 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1467402' 00:13:45.367 killing process with pid 1467402 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1467402 00:13:45.367 [2024-07-15 11:54:58.922909] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:45.367 11:54:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1467402 
00:13:45.367 [2024-07-15 11:54:58.933603] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Fu23AYOz47 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:13:45.625 00:13:45.625 real 0m6.355s 00:13:45.625 user 0m9.979s 00:13:45.625 sys 0m1.099s 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:45.625 11:54:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.626 ************************************ 00:13:45.626 END TEST raid_write_error_test 00:13:45.626 ************************************ 00:13:45.626 11:54:59 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:45.626 11:54:59 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:45.626 11:54:59 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:13:45.626 11:54:59 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:45.626 11:54:59 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:45.626 11:54:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:45.884 ************************************ 00:13:45.884 START TEST 
raid_state_function_test 00:13:45.884 ************************************ 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 false 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:45.884 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1468370 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1468370' 00:13:45.885 Process raid pid: 1468370 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1468370 /var/tmp/spdk-raid.sock 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1468370 ']' 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:45.885 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:45.885 11:54:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:45.885 [2024-07-15 11:54:59.327089] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:13:45.885 [2024-07-15 11:54:59.327160] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:45.885 [2024-07-15 11:54:59.451380] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:46.143 [2024-07-15 11:54:59.556506] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:46.143 [2024-07-15 11:54:59.610568] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:46.143 [2024-07-15 11:54:59.610595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:46.711 11:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:46.711 11:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:13:46.711 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:46.971 [2024-07-15 11:55:00.561717] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.971 [2024-07-15 11:55:00.561761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.971 [2024-07-15 11:55:00.561772] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.971 [2024-07-15 11:55:00.561784] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.230 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.489 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.490 "name": "Existed_Raid", 00:13:47.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.490 "strip_size_kb": 0, 00:13:47.490 "state": "configuring", 00:13:47.490 "raid_level": "raid1", 00:13:47.490 "superblock": false, 00:13:47.490 "num_base_bdevs": 2, 00:13:47.490 "num_base_bdevs_discovered": 0, 00:13:47.490 "num_base_bdevs_operational": 2, 
00:13:47.490 "base_bdevs_list": [ 00:13:47.490 { 00:13:47.490 "name": "BaseBdev1", 00:13:47.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.490 "is_configured": false, 00:13:47.490 "data_offset": 0, 00:13:47.490 "data_size": 0 00:13:47.490 }, 00:13:47.490 { 00:13:47.490 "name": "BaseBdev2", 00:13:47.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.490 "is_configured": false, 00:13:47.490 "data_offset": 0, 00:13:47.490 "data_size": 0 00:13:47.490 } 00:13:47.490 ] 00:13:47.490 }' 00:13:47.490 11:55:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.490 11:55:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.454 11:55:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:48.713 [2024-07-15 11:55:02.213908] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:48.713 [2024-07-15 11:55:02.213939] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1895b00 name Existed_Raid, state configuring 00:13:48.713 11:55:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:48.972 [2024-07-15 11:55:02.470596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:48.972 [2024-07-15 11:55:02.470632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:48.972 [2024-07-15 11:55:02.470642] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:48.972 [2024-07-15 11:55:02.470653] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:48.972 11:55:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:49.538 [2024-07-15 11:55:02.981679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:49.538 BaseBdev1 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:49.538 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.797 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:50.056 [ 00:13:50.056 { 00:13:50.056 "name": "BaseBdev1", 00:13:50.056 "aliases": [ 00:13:50.056 "57fe2723-b9c1-486a-98ee-7ec6c8704184" 00:13:50.056 ], 00:13:50.056 "product_name": "Malloc disk", 00:13:50.056 "block_size": 512, 00:13:50.056 "num_blocks": 65536, 00:13:50.056 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:50.056 "assigned_rate_limits": { 00:13:50.056 "rw_ios_per_sec": 0, 00:13:50.056 "rw_mbytes_per_sec": 0, 00:13:50.056 "r_mbytes_per_sec": 0, 00:13:50.056 "w_mbytes_per_sec": 0 00:13:50.056 }, 00:13:50.056 "claimed": true, 
00:13:50.056 "claim_type": "exclusive_write", 00:13:50.056 "zoned": false, 00:13:50.056 "supported_io_types": { 00:13:50.056 "read": true, 00:13:50.056 "write": true, 00:13:50.056 "unmap": true, 00:13:50.056 "flush": true, 00:13:50.056 "reset": true, 00:13:50.056 "nvme_admin": false, 00:13:50.056 "nvme_io": false, 00:13:50.056 "nvme_io_md": false, 00:13:50.056 "write_zeroes": true, 00:13:50.056 "zcopy": true, 00:13:50.056 "get_zone_info": false, 00:13:50.056 "zone_management": false, 00:13:50.056 "zone_append": false, 00:13:50.056 "compare": false, 00:13:50.056 "compare_and_write": false, 00:13:50.056 "abort": true, 00:13:50.056 "seek_hole": false, 00:13:50.056 "seek_data": false, 00:13:50.056 "copy": true, 00:13:50.056 "nvme_iov_md": false 00:13:50.056 }, 00:13:50.056 "memory_domains": [ 00:13:50.056 { 00:13:50.056 "dma_device_id": "system", 00:13:50.056 "dma_device_type": 1 00:13:50.056 }, 00:13:50.056 { 00:13:50.056 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.056 "dma_device_type": 2 00:13:50.056 } 00:13:50.056 ], 00:13:50.056 "driver_specific": {} 00:13:50.056 } 00:13:50.056 ] 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.056 11:55:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.622 11:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.622 "name": "Existed_Raid", 00:13:50.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.622 "strip_size_kb": 0, 00:13:50.622 "state": "configuring", 00:13:50.622 "raid_level": "raid1", 00:13:50.622 "superblock": false, 00:13:50.622 "num_base_bdevs": 2, 00:13:50.622 "num_base_bdevs_discovered": 1, 00:13:50.622 "num_base_bdevs_operational": 2, 00:13:50.622 "base_bdevs_list": [ 00:13:50.622 { 00:13:50.622 "name": "BaseBdev1", 00:13:50.622 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:50.622 "is_configured": true, 00:13:50.622 "data_offset": 0, 00:13:50.622 "data_size": 65536 00:13:50.622 }, 00:13:50.622 { 00:13:50.622 "name": "BaseBdev2", 00:13:50.622 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:50.622 "is_configured": false, 00:13:50.622 "data_offset": 0, 00:13:50.622 "data_size": 0 00:13:50.622 } 00:13:50.622 ] 00:13:50.622 }' 00:13:50.622 11:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.622 11:55:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:51.205 11:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:51.464 [2024-07-15 11:55:04.878740] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:51.464 [2024-07-15 11:55:04.878778] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18953d0 name Existed_Raid, state configuring 00:13:51.464 11:55:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:51.722 [2024-07-15 11:55:05.127405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:51.722 [2024-07-15 11:55:05.128884] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:51.722 [2024-07-15 11:55:05.128917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:51.722 11:55:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:51.722 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:51.981 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:51.981 "name": "Existed_Raid", 00:13:51.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.981 "strip_size_kb": 0, 00:13:51.981 "state": "configuring", 00:13:51.981 "raid_level": "raid1", 00:13:51.981 "superblock": false, 00:13:51.981 "num_base_bdevs": 2, 00:13:51.981 "num_base_bdevs_discovered": 1, 00:13:51.981 "num_base_bdevs_operational": 2, 00:13:51.981 "base_bdevs_list": [ 00:13:51.981 { 00:13:51.981 "name": "BaseBdev1", 00:13:51.981 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:51.981 "is_configured": true, 00:13:51.981 "data_offset": 0, 00:13:51.981 "data_size": 65536 00:13:51.981 }, 00:13:51.981 { 00:13:51.981 "name": "BaseBdev2", 00:13:51.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:51.981 "is_configured": false, 00:13:51.981 "data_offset": 0, 00:13:51.981 "data_size": 0 00:13:51.981 } 00:13:51.981 ] 00:13:51.981 }' 00:13:51.981 11:55:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:51.981 11:55:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:52.549 11:55:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:52.808 [2024-07-15 11:55:06.277854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:52.808 [2024-07-15 11:55:06.277892] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1896150 00:13:52.808 [2024-07-15 11:55:06.277901] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:13:52.808 [2024-07-15 11:55:06.278093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b0420 00:13:52.808 [2024-07-15 11:55:06.278213] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1896150 00:13:52.808 [2024-07-15 11:55:06.278223] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1896150 00:13:52.808 [2024-07-15 11:55:06.278385] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:52.808 BaseBdev2 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:13:52.808 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.067 11:55:06 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:53.326 [ 00:13:53.326 { 00:13:53.326 "name": "BaseBdev2", 00:13:53.326 "aliases": [ 00:13:53.326 "a6e5d201-c166-4b8a-8c0a-a611625c5a6e" 00:13:53.326 ], 00:13:53.326 "product_name": "Malloc disk", 00:13:53.326 "block_size": 512, 00:13:53.326 "num_blocks": 65536, 00:13:53.326 "uuid": "a6e5d201-c166-4b8a-8c0a-a611625c5a6e", 00:13:53.326 "assigned_rate_limits": { 00:13:53.326 "rw_ios_per_sec": 0, 00:13:53.326 "rw_mbytes_per_sec": 0, 00:13:53.326 "r_mbytes_per_sec": 0, 00:13:53.326 "w_mbytes_per_sec": 0 00:13:53.326 }, 00:13:53.326 "claimed": true, 00:13:53.326 "claim_type": "exclusive_write", 00:13:53.327 "zoned": false, 00:13:53.327 "supported_io_types": { 00:13:53.327 "read": true, 00:13:53.327 "write": true, 00:13:53.327 "unmap": true, 00:13:53.327 "flush": true, 00:13:53.327 "reset": true, 00:13:53.327 "nvme_admin": false, 00:13:53.327 "nvme_io": false, 00:13:53.327 "nvme_io_md": false, 00:13:53.327 "write_zeroes": true, 00:13:53.327 "zcopy": true, 00:13:53.327 "get_zone_info": false, 00:13:53.327 "zone_management": false, 00:13:53.327 "zone_append": false, 00:13:53.327 "compare": false, 00:13:53.327 "compare_and_write": false, 00:13:53.327 "abort": true, 00:13:53.327 "seek_hole": false, 00:13:53.327 "seek_data": false, 00:13:53.327 "copy": true, 00:13:53.327 "nvme_iov_md": false 00:13:53.327 }, 00:13:53.327 "memory_domains": [ 00:13:53.327 { 00:13:53.327 "dma_device_id": "system", 00:13:53.327 "dma_device_type": 1 00:13:53.327 }, 00:13:53.327 { 00:13:53.327 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:53.327 "dma_device_type": 2 00:13:53.327 } 00:13:53.327 ], 00:13:53.327 "driver_specific": {} 00:13:53.327 } 00:13:53.327 ] 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.327 11:55:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.586 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.586 "name": "Existed_Raid", 00:13:53.586 "uuid": "aa9f8496-9a5d-4449-954e-10161be780e0", 00:13:53.586 "strip_size_kb": 0, 00:13:53.586 "state": "online", 00:13:53.586 "raid_level": "raid1", 00:13:53.586 "superblock": false, 00:13:53.586 "num_base_bdevs": 
2, 00:13:53.586 "num_base_bdevs_discovered": 2, 00:13:53.586 "num_base_bdevs_operational": 2, 00:13:53.586 "base_bdevs_list": [ 00:13:53.586 { 00:13:53.586 "name": "BaseBdev1", 00:13:53.586 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:53.586 "is_configured": true, 00:13:53.586 "data_offset": 0, 00:13:53.586 "data_size": 65536 00:13:53.586 }, 00:13:53.586 { 00:13:53.586 "name": "BaseBdev2", 00:13:53.586 "uuid": "a6e5d201-c166-4b8a-8c0a-a611625c5a6e", 00:13:53.586 "is_configured": true, 00:13:53.586 "data_offset": 0, 00:13:53.586 "data_size": 65536 00:13:53.586 } 00:13:53.586 ] 00:13:53.586 }' 00:13:53.586 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.586 11:55:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:54.156 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:54.415 [2024-07-15 11:55:07.870410] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:54.415 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:13:54.415 "name": "Existed_Raid", 00:13:54.415 "aliases": [ 00:13:54.415 "aa9f8496-9a5d-4449-954e-10161be780e0" 00:13:54.415 ], 00:13:54.415 "product_name": "Raid Volume", 00:13:54.415 "block_size": 512, 00:13:54.415 "num_blocks": 65536, 00:13:54.415 "uuid": "aa9f8496-9a5d-4449-954e-10161be780e0", 00:13:54.415 "assigned_rate_limits": { 00:13:54.415 "rw_ios_per_sec": 0, 00:13:54.415 "rw_mbytes_per_sec": 0, 00:13:54.415 "r_mbytes_per_sec": 0, 00:13:54.415 "w_mbytes_per_sec": 0 00:13:54.415 }, 00:13:54.415 "claimed": false, 00:13:54.415 "zoned": false, 00:13:54.415 "supported_io_types": { 00:13:54.415 "read": true, 00:13:54.415 "write": true, 00:13:54.415 "unmap": false, 00:13:54.415 "flush": false, 00:13:54.415 "reset": true, 00:13:54.415 "nvme_admin": false, 00:13:54.415 "nvme_io": false, 00:13:54.415 "nvme_io_md": false, 00:13:54.415 "write_zeroes": true, 00:13:54.415 "zcopy": false, 00:13:54.415 "get_zone_info": false, 00:13:54.415 "zone_management": false, 00:13:54.415 "zone_append": false, 00:13:54.415 "compare": false, 00:13:54.415 "compare_and_write": false, 00:13:54.415 "abort": false, 00:13:54.415 "seek_hole": false, 00:13:54.415 "seek_data": false, 00:13:54.415 "copy": false, 00:13:54.415 "nvme_iov_md": false 00:13:54.415 }, 00:13:54.415 "memory_domains": [ 00:13:54.415 { 00:13:54.415 "dma_device_id": "system", 00:13:54.415 "dma_device_type": 1 00:13:54.415 }, 00:13:54.415 { 00:13:54.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.415 "dma_device_type": 2 00:13:54.415 }, 00:13:54.415 { 00:13:54.415 "dma_device_id": "system", 00:13:54.415 "dma_device_type": 1 00:13:54.415 }, 00:13:54.415 { 00:13:54.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.415 "dma_device_type": 2 00:13:54.415 } 00:13:54.415 ], 00:13:54.415 "driver_specific": { 00:13:54.415 "raid": { 00:13:54.415 "uuid": "aa9f8496-9a5d-4449-954e-10161be780e0", 00:13:54.415 "strip_size_kb": 0, 00:13:54.415 "state": "online", 00:13:54.415 "raid_level": "raid1", 
00:13:54.415 "superblock": false, 00:13:54.415 "num_base_bdevs": 2, 00:13:54.415 "num_base_bdevs_discovered": 2, 00:13:54.415 "num_base_bdevs_operational": 2, 00:13:54.415 "base_bdevs_list": [ 00:13:54.415 { 00:13:54.415 "name": "BaseBdev1", 00:13:54.415 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:54.415 "is_configured": true, 00:13:54.415 "data_offset": 0, 00:13:54.415 "data_size": 65536 00:13:54.415 }, 00:13:54.415 { 00:13:54.415 "name": "BaseBdev2", 00:13:54.415 "uuid": "a6e5d201-c166-4b8a-8c0a-a611625c5a6e", 00:13:54.415 "is_configured": true, 00:13:54.415 "data_offset": 0, 00:13:54.415 "data_size": 65536 00:13:54.415 } 00:13:54.415 ] 00:13:54.415 } 00:13:54.415 } 00:13:54.415 }' 00:13:54.416 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:54.416 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:54.416 BaseBdev2' 00:13:54.416 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.416 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:54.416 11:55:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:54.675 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:54.675 "name": "BaseBdev1", 00:13:54.675 "aliases": [ 00:13:54.675 "57fe2723-b9c1-486a-98ee-7ec6c8704184" 00:13:54.675 ], 00:13:54.675 "product_name": "Malloc disk", 00:13:54.675 "block_size": 512, 00:13:54.675 "num_blocks": 65536, 00:13:54.675 "uuid": "57fe2723-b9c1-486a-98ee-7ec6c8704184", 00:13:54.675 "assigned_rate_limits": { 00:13:54.675 "rw_ios_per_sec": 0, 00:13:54.675 "rw_mbytes_per_sec": 0, 00:13:54.675 "r_mbytes_per_sec": 0, 00:13:54.675 
"w_mbytes_per_sec": 0 00:13:54.675 }, 00:13:54.675 "claimed": true, 00:13:54.675 "claim_type": "exclusive_write", 00:13:54.675 "zoned": false, 00:13:54.675 "supported_io_types": { 00:13:54.675 "read": true, 00:13:54.675 "write": true, 00:13:54.675 "unmap": true, 00:13:54.675 "flush": true, 00:13:54.675 "reset": true, 00:13:54.675 "nvme_admin": false, 00:13:54.675 "nvme_io": false, 00:13:54.675 "nvme_io_md": false, 00:13:54.675 "write_zeroes": true, 00:13:54.675 "zcopy": true, 00:13:54.675 "get_zone_info": false, 00:13:54.675 "zone_management": false, 00:13:54.675 "zone_append": false, 00:13:54.675 "compare": false, 00:13:54.675 "compare_and_write": false, 00:13:54.675 "abort": true, 00:13:54.675 "seek_hole": false, 00:13:54.675 "seek_data": false, 00:13:54.675 "copy": true, 00:13:54.675 "nvme_iov_md": false 00:13:54.675 }, 00:13:54.675 "memory_domains": [ 00:13:54.675 { 00:13:54.675 "dma_device_id": "system", 00:13:54.675 "dma_device_type": 1 00:13:54.675 }, 00:13:54.675 { 00:13:54.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.675 "dma_device_type": 2 00:13:54.675 } 00:13:54.675 ], 00:13:54.675 "driver_specific": {} 00:13:54.675 }' 00:13:54.675 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.675 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:54.675 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:54.993 
11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:54.993 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:55.253 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:55.253 "name": "BaseBdev2", 00:13:55.253 "aliases": [ 00:13:55.253 "a6e5d201-c166-4b8a-8c0a-a611625c5a6e" 00:13:55.253 ], 00:13:55.253 "product_name": "Malloc disk", 00:13:55.253 "block_size": 512, 00:13:55.253 "num_blocks": 65536, 00:13:55.253 "uuid": "a6e5d201-c166-4b8a-8c0a-a611625c5a6e", 00:13:55.253 "assigned_rate_limits": { 00:13:55.253 "rw_ios_per_sec": 0, 00:13:55.253 "rw_mbytes_per_sec": 0, 00:13:55.253 "r_mbytes_per_sec": 0, 00:13:55.253 "w_mbytes_per_sec": 0 00:13:55.253 }, 00:13:55.253 "claimed": true, 00:13:55.253 "claim_type": "exclusive_write", 00:13:55.253 "zoned": false, 00:13:55.253 "supported_io_types": { 00:13:55.253 "read": true, 00:13:55.253 "write": true, 00:13:55.253 "unmap": true, 00:13:55.253 "flush": true, 00:13:55.253 "reset": true, 00:13:55.253 "nvme_admin": false, 00:13:55.253 "nvme_io": false, 00:13:55.253 "nvme_io_md": false, 00:13:55.253 "write_zeroes": true, 00:13:55.253 "zcopy": true, 00:13:55.253 "get_zone_info": false, 00:13:55.253 "zone_management": false, 00:13:55.253 "zone_append": false, 00:13:55.253 "compare": 
false, 00:13:55.253 "compare_and_write": false, 00:13:55.253 "abort": true, 00:13:55.253 "seek_hole": false, 00:13:55.253 "seek_data": false, 00:13:55.253 "copy": true, 00:13:55.253 "nvme_iov_md": false 00:13:55.253 }, 00:13:55.253 "memory_domains": [ 00:13:55.253 { 00:13:55.253 "dma_device_id": "system", 00:13:55.253 "dma_device_type": 1 00:13:55.253 }, 00:13:55.253 { 00:13:55.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.253 "dma_device_type": 2 00:13:55.253 } 00:13:55.253 ], 00:13:55.253 "driver_specific": {} 00:13:55.253 }' 00:13:55.253 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.253 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:55.512 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:55.512 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.512 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:55.513 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:55.513 11:55:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.513 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:55.513 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:55.513 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.513 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:55.772 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:55.772 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:55.772 
[2024-07-15 11:55:09.362154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.031 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.031 11:55:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.290 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.290 "name": "Existed_Raid", 00:13:56.290 "uuid": "aa9f8496-9a5d-4449-954e-10161be780e0", 00:13:56.290 "strip_size_kb": 0, 00:13:56.290 "state": "online", 00:13:56.290 "raid_level": "raid1", 00:13:56.290 "superblock": false, 00:13:56.290 "num_base_bdevs": 2, 00:13:56.290 "num_base_bdevs_discovered": 1, 00:13:56.290 "num_base_bdevs_operational": 1, 00:13:56.290 "base_bdevs_list": [ 00:13:56.290 { 00:13:56.290 "name": null, 00:13:56.290 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.290 "is_configured": false, 00:13:56.290 "data_offset": 0, 00:13:56.290 "data_size": 65536 00:13:56.290 }, 00:13:56.290 { 00:13:56.290 "name": "BaseBdev2", 00:13:56.290 "uuid": "a6e5d201-c166-4b8a-8c0a-a611625c5a6e", 00:13:56.290 "is_configured": true, 00:13:56.290 "data_offset": 0, 00:13:56.290 "data_size": 65536 00:13:56.290 } 00:13:56.290 ] 00:13:56.290 }' 00:13:56.290 11:55:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.290 11:55:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:56.858 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:56.858 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:56.858 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.858 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:57.117 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:57.117 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:13:57.117 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:57.376 [2024-07-15 11:55:10.726751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:57.376 [2024-07-15 11:55:10.726827] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:57.376 [2024-07-15 11:55:10.737593] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:57.376 [2024-07-15 11:55:10.737624] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:57.376 [2024-07-15 11:55:10.737635] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1896150 name Existed_Raid, state offline 00:13:57.376 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:57.376 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:57.376 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.376 11:55:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1468370 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1468370 ']' 00:13:57.636 11:55:11 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1468370 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1468370 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1468370' 00:13:57.636 killing process with pid 1468370 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1468370 00:13:57.636 [2024-07-15 11:55:11.067137] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:57.636 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1468370 00:13:57.636 [2024-07-15 11:55:11.068022] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:57.907 11:55:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:57.907 00:13:57.907 real 0m12.031s 00:13:57.907 user 0m21.528s 00:13:57.907 sys 0m2.131s 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:57.908 ************************************ 00:13:57.908 END TEST raid_state_function_test 00:13:57.908 ************************************ 00:13:57.908 11:55:11 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:13:57.908 11:55:11 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test 
raid_state_function_test_sb raid_state_function_test raid1 2 true 00:13:57.908 11:55:11 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:13:57.908 11:55:11 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:13:57.908 11:55:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:57.908 ************************************ 00:13:57.908 START TEST raid_state_function_test_sb 00:13:57.908 ************************************ 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1470121 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1470121' 00:13:57.908 Process raid pid: 1470121 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1470121 /var/tmp/spdk-raid.sock 00:13:57.908 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1470121 ']' 00:13:57.909 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:57.909 11:55:11 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@834 -- # local max_retries=100 00:13:57.909 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:57.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:57.909 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:13:57.909 11:55:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.909 [2024-07-15 11:55:11.439205] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:13:57.909 [2024-07-15 11:55:11.439270] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:58.173 [2024-07-15 11:55:11.571648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:58.173 [2024-07-15 11:55:11.676695] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:58.173 [2024-07-15 11:55:11.744397] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:58.173 [2024-07-15 11:55:11.744433] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:13:59.109 [2024-07-15 11:55:12.599264] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev1 00:13:59.109 [2024-07-15 11:55:12.599309] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:59.109 [2024-07-15 11:55:12.599319] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:59.109 [2024-07-15 11:55:12.599331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.109 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.368 11:55:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.368 "name": "Existed_Raid", 00:13:59.368 "uuid": "6b70f93f-9ea9-4ac6-b548-1990a059a646", 00:13:59.368 "strip_size_kb": 0, 00:13:59.368 "state": "configuring", 00:13:59.368 "raid_level": "raid1", 00:13:59.368 "superblock": true, 00:13:59.368 "num_base_bdevs": 2, 00:13:59.368 "num_base_bdevs_discovered": 0, 00:13:59.368 "num_base_bdevs_operational": 2, 00:13:59.368 "base_bdevs_list": [ 00:13:59.368 { 00:13:59.368 "name": "BaseBdev1", 00:13:59.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.368 "is_configured": false, 00:13:59.368 "data_offset": 0, 00:13:59.368 "data_size": 0 00:13:59.368 }, 00:13:59.368 { 00:13:59.368 "name": "BaseBdev2", 00:13:59.368 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:59.368 "is_configured": false, 00:13:59.368 "data_offset": 0, 00:13:59.368 "data_size": 0 00:13:59.368 } 00:13:59.368 ] 00:13:59.368 }' 00:13:59.368 11:55:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.368 11:55:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.935 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:00.193 [2024-07-15 11:55:13.734116] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:00.193 [2024-07-15 11:55:13.734148] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f3b00 name Existed_Raid, state configuring 00:14:00.193 11:55:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:00.451 [2024-07-15 11:55:13.982795] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:00.451 
[2024-07-15 11:55:13.982823] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:00.451 [2024-07-15 11:55:13.982832] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:00.451 [2024-07-15 11:55:13.982844] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:00.451 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:00.710 [2024-07-15 11:55:14.289340] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:00.710 BaseBdev1 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:00.968 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:01.226 [ 00:14:01.226 { 00:14:01.226 "name": "BaseBdev1", 00:14:01.226 "aliases": [ 00:14:01.226 
"63596cf7-602c-48e7-ac0a-1a3e4c71ba65" 00:14:01.226 ], 00:14:01.226 "product_name": "Malloc disk", 00:14:01.226 "block_size": 512, 00:14:01.226 "num_blocks": 65536, 00:14:01.226 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:01.226 "assigned_rate_limits": { 00:14:01.226 "rw_ios_per_sec": 0, 00:14:01.226 "rw_mbytes_per_sec": 0, 00:14:01.226 "r_mbytes_per_sec": 0, 00:14:01.226 "w_mbytes_per_sec": 0 00:14:01.226 }, 00:14:01.226 "claimed": true, 00:14:01.226 "claim_type": "exclusive_write", 00:14:01.226 "zoned": false, 00:14:01.226 "supported_io_types": { 00:14:01.226 "read": true, 00:14:01.226 "write": true, 00:14:01.226 "unmap": true, 00:14:01.226 "flush": true, 00:14:01.226 "reset": true, 00:14:01.226 "nvme_admin": false, 00:14:01.226 "nvme_io": false, 00:14:01.226 "nvme_io_md": false, 00:14:01.226 "write_zeroes": true, 00:14:01.226 "zcopy": true, 00:14:01.226 "get_zone_info": false, 00:14:01.226 "zone_management": false, 00:14:01.226 "zone_append": false, 00:14:01.226 "compare": false, 00:14:01.226 "compare_and_write": false, 00:14:01.226 "abort": true, 00:14:01.226 "seek_hole": false, 00:14:01.226 "seek_data": false, 00:14:01.226 "copy": true, 00:14:01.226 "nvme_iov_md": false 00:14:01.226 }, 00:14:01.226 "memory_domains": [ 00:14:01.226 { 00:14:01.226 "dma_device_id": "system", 00:14:01.226 "dma_device_type": 1 00:14:01.226 }, 00:14:01.226 { 00:14:01.226 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.226 "dma_device_type": 2 00:14:01.226 } 00:14:01.226 ], 00:14:01.226 "driver_specific": {} 00:14:01.226 } 00:14:01.226 ] 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.226 11:55:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.484 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.484 "name": "Existed_Raid", 00:14:01.484 "uuid": "da44d69a-4d91-4857-8173-e7c86f745991", 00:14:01.484 "strip_size_kb": 0, 00:14:01.484 "state": "configuring", 00:14:01.484 "raid_level": "raid1", 00:14:01.484 "superblock": true, 00:14:01.484 "num_base_bdevs": 2, 00:14:01.484 "num_base_bdevs_discovered": 1, 00:14:01.484 "num_base_bdevs_operational": 2, 00:14:01.484 "base_bdevs_list": [ 00:14:01.484 { 00:14:01.484 "name": "BaseBdev1", 00:14:01.484 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:01.484 "is_configured": true, 00:14:01.484 "data_offset": 2048, 00:14:01.484 "data_size": 63488 00:14:01.484 }, 00:14:01.484 { 00:14:01.484 "name": "BaseBdev2", 00:14:01.484 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:01.484 "is_configured": false, 00:14:01.484 "data_offset": 0, 00:14:01.484 "data_size": 0 00:14:01.484 } 00:14:01.484 ] 00:14:01.484 }' 00:14:01.484 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.484 11:55:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:02.418 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:02.418 [2024-07-15 11:55:15.925842] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:02.418 [2024-07-15 11:55:15.925885] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f33d0 name Existed_Raid, state configuring 00:14:02.418 11:55:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:14:02.677 [2024-07-15 11:55:16.174528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:02.677 [2024-07-15 11:55:16.176035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:02.677 [2024-07-15 11:55:16.176069] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.677 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.935 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.935 "name": "Existed_Raid", 00:14:02.935 "uuid": "6e76eecf-f510-4376-abbf-3995fc872cd6", 00:14:02.935 "strip_size_kb": 0, 00:14:02.935 "state": "configuring", 00:14:02.935 "raid_level": "raid1", 00:14:02.935 "superblock": true, 00:14:02.935 "num_base_bdevs": 2, 00:14:02.935 "num_base_bdevs_discovered": 1, 00:14:02.935 "num_base_bdevs_operational": 2, 00:14:02.935 "base_bdevs_list": [ 00:14:02.935 { 00:14:02.935 "name": "BaseBdev1", 00:14:02.935 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:02.935 "is_configured": true, 00:14:02.935 "data_offset": 2048, 00:14:02.935 "data_size": 63488 00:14:02.935 }, 00:14:02.935 
{ 00:14:02.935 "name": "BaseBdev2", 00:14:02.935 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.935 "is_configured": false, 00:14:02.935 "data_offset": 0, 00:14:02.935 "data_size": 0 00:14:02.935 } 00:14:02.935 ] 00:14:02.935 }' 00:14:02.935 11:55:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.935 11:55:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.870 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:04.128 [2024-07-15 11:55:17.549412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:04.128 [2024-07-15 11:55:17.549556] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11f4150 00:14:04.128 [2024-07-15 11:55:17.549569] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:04.128 [2024-07-15 11:55:17.549764] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110e420 00:14:04.128 [2024-07-15 11:55:17.549886] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11f4150 00:14:04.128 [2024-07-15 11:55:17.549897] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11f4150 00:14:04.128 [2024-07-15 11:55:17.549988] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:04.128 BaseBdev2 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:04.128 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.386 11:55:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:04.645 [ 00:14:04.645 { 00:14:04.645 "name": "BaseBdev2", 00:14:04.645 "aliases": [ 00:14:04.645 "ca4548d3-8c93-4d2b-ad36-21675014029c" 00:14:04.645 ], 00:14:04.645 "product_name": "Malloc disk", 00:14:04.645 "block_size": 512, 00:14:04.645 "num_blocks": 65536, 00:14:04.645 "uuid": "ca4548d3-8c93-4d2b-ad36-21675014029c", 00:14:04.645 "assigned_rate_limits": { 00:14:04.645 "rw_ios_per_sec": 0, 00:14:04.645 "rw_mbytes_per_sec": 0, 00:14:04.645 "r_mbytes_per_sec": 0, 00:14:04.645 "w_mbytes_per_sec": 0 00:14:04.645 }, 00:14:04.645 "claimed": true, 00:14:04.645 "claim_type": "exclusive_write", 00:14:04.645 "zoned": false, 00:14:04.645 "supported_io_types": { 00:14:04.645 "read": true, 00:14:04.645 "write": true, 00:14:04.645 "unmap": true, 00:14:04.645 "flush": true, 00:14:04.645 "reset": true, 00:14:04.645 "nvme_admin": false, 00:14:04.645 "nvme_io": false, 00:14:04.645 "nvme_io_md": false, 00:14:04.645 "write_zeroes": true, 00:14:04.645 "zcopy": true, 00:14:04.645 "get_zone_info": false, 00:14:04.645 "zone_management": false, 00:14:04.645 "zone_append": false, 00:14:04.645 "compare": false, 00:14:04.645 "compare_and_write": false, 00:14:04.645 "abort": true, 00:14:04.645 "seek_hole": false, 00:14:04.645 "seek_data": false, 00:14:04.645 "copy": true, 00:14:04.645 
"nvme_iov_md": false 00:14:04.645 }, 00:14:04.645 "memory_domains": [ 00:14:04.645 { 00:14:04.645 "dma_device_id": "system", 00:14:04.645 "dma_device_type": 1 00:14:04.645 }, 00:14:04.645 { 00:14:04.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:04.645 "dma_device_type": 2 00:14:04.645 } 00:14:04.645 ], 00:14:04.645 "driver_specific": {} 00:14:04.645 } 00:14:04.645 ] 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.645 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:04.904 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:04.904 "name": "Existed_Raid", 00:14:04.904 "uuid": "6e76eecf-f510-4376-abbf-3995fc872cd6", 00:14:04.904 "strip_size_kb": 0, 00:14:04.904 "state": "online", 00:14:04.904 "raid_level": "raid1", 00:14:04.904 "superblock": true, 00:14:04.904 "num_base_bdevs": 2, 00:14:04.904 "num_base_bdevs_discovered": 2, 00:14:04.904 "num_base_bdevs_operational": 2, 00:14:04.904 "base_bdevs_list": [ 00:14:04.904 { 00:14:04.904 "name": "BaseBdev1", 00:14:04.904 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:04.904 "is_configured": true, 00:14:04.904 "data_offset": 2048, 00:14:04.904 "data_size": 63488 00:14:04.904 }, 00:14:04.904 { 00:14:04.904 "name": "BaseBdev2", 00:14:04.904 "uuid": "ca4548d3-8c93-4d2b-ad36-21675014029c", 00:14:04.904 "is_configured": true, 00:14:04.904 "data_offset": 2048, 00:14:04.904 "data_size": 63488 00:14:04.904 } 00:14:04.904 ] 00:14:04.904 }' 00:14:04.904 11:55:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:04.904 11:55:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:05.839 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:05.839 [2024-07-15 11:55:19.426667] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:06.097 "name": "Existed_Raid", 00:14:06.097 "aliases": [ 00:14:06.097 "6e76eecf-f510-4376-abbf-3995fc872cd6" 00:14:06.097 ], 00:14:06.097 "product_name": "Raid Volume", 00:14:06.097 "block_size": 512, 00:14:06.097 "num_blocks": 63488, 00:14:06.097 "uuid": "6e76eecf-f510-4376-abbf-3995fc872cd6", 00:14:06.097 "assigned_rate_limits": { 00:14:06.097 "rw_ios_per_sec": 0, 00:14:06.097 "rw_mbytes_per_sec": 0, 00:14:06.097 "r_mbytes_per_sec": 0, 00:14:06.097 "w_mbytes_per_sec": 0 00:14:06.097 }, 00:14:06.097 "claimed": false, 00:14:06.097 "zoned": false, 00:14:06.097 "supported_io_types": { 00:14:06.097 "read": true, 00:14:06.097 "write": true, 00:14:06.097 "unmap": false, 00:14:06.097 "flush": false, 00:14:06.097 "reset": true, 00:14:06.097 "nvme_admin": false, 00:14:06.097 "nvme_io": false, 00:14:06.097 "nvme_io_md": false, 00:14:06.097 "write_zeroes": true, 00:14:06.097 "zcopy": false, 00:14:06.097 "get_zone_info": false, 00:14:06.097 "zone_management": false, 00:14:06.097 "zone_append": false, 00:14:06.097 "compare": false, 00:14:06.097 "compare_and_write": false, 00:14:06.097 "abort": false, 00:14:06.097 "seek_hole": false, 00:14:06.097 "seek_data": false, 00:14:06.097 "copy": false, 00:14:06.097 "nvme_iov_md": false 00:14:06.097 }, 00:14:06.097 "memory_domains": [ 00:14:06.097 { 
00:14:06.097 "dma_device_id": "system", 00:14:06.097 "dma_device_type": 1 00:14:06.097 }, 00:14:06.097 { 00:14:06.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.097 "dma_device_type": 2 00:14:06.097 }, 00:14:06.097 { 00:14:06.097 "dma_device_id": "system", 00:14:06.097 "dma_device_type": 1 00:14:06.097 }, 00:14:06.097 { 00:14:06.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.097 "dma_device_type": 2 00:14:06.097 } 00:14:06.097 ], 00:14:06.097 "driver_specific": { 00:14:06.097 "raid": { 00:14:06.097 "uuid": "6e76eecf-f510-4376-abbf-3995fc872cd6", 00:14:06.097 "strip_size_kb": 0, 00:14:06.097 "state": "online", 00:14:06.097 "raid_level": "raid1", 00:14:06.097 "superblock": true, 00:14:06.097 "num_base_bdevs": 2, 00:14:06.097 "num_base_bdevs_discovered": 2, 00:14:06.097 "num_base_bdevs_operational": 2, 00:14:06.097 "base_bdevs_list": [ 00:14:06.097 { 00:14:06.097 "name": "BaseBdev1", 00:14:06.097 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:06.097 "is_configured": true, 00:14:06.097 "data_offset": 2048, 00:14:06.097 "data_size": 63488 00:14:06.097 }, 00:14:06.097 { 00:14:06.097 "name": "BaseBdev2", 00:14:06.097 "uuid": "ca4548d3-8c93-4d2b-ad36-21675014029c", 00:14:06.097 "is_configured": true, 00:14:06.097 "data_offset": 2048, 00:14:06.097 "data_size": 63488 00:14:06.097 } 00:14:06.097 ] 00:14:06.097 } 00:14:06.097 } 00:14:06.097 }' 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:06.097 BaseBdev2' 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev1 00:14:06.097 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.356 "name": "BaseBdev1", 00:14:06.356 "aliases": [ 00:14:06.356 "63596cf7-602c-48e7-ac0a-1a3e4c71ba65" 00:14:06.356 ], 00:14:06.356 "product_name": "Malloc disk", 00:14:06.356 "block_size": 512, 00:14:06.356 "num_blocks": 65536, 00:14:06.356 "uuid": "63596cf7-602c-48e7-ac0a-1a3e4c71ba65", 00:14:06.356 "assigned_rate_limits": { 00:14:06.356 "rw_ios_per_sec": 0, 00:14:06.356 "rw_mbytes_per_sec": 0, 00:14:06.356 "r_mbytes_per_sec": 0, 00:14:06.356 "w_mbytes_per_sec": 0 00:14:06.356 }, 00:14:06.356 "claimed": true, 00:14:06.356 "claim_type": "exclusive_write", 00:14:06.356 "zoned": false, 00:14:06.356 "supported_io_types": { 00:14:06.356 "read": true, 00:14:06.356 "write": true, 00:14:06.356 "unmap": true, 00:14:06.356 "flush": true, 00:14:06.356 "reset": true, 00:14:06.356 "nvme_admin": false, 00:14:06.356 "nvme_io": false, 00:14:06.356 "nvme_io_md": false, 00:14:06.356 "write_zeroes": true, 00:14:06.356 "zcopy": true, 00:14:06.356 "get_zone_info": false, 00:14:06.356 "zone_management": false, 00:14:06.356 "zone_append": false, 00:14:06.356 "compare": false, 00:14:06.356 "compare_and_write": false, 00:14:06.356 "abort": true, 00:14:06.356 "seek_hole": false, 00:14:06.356 "seek_data": false, 00:14:06.356 "copy": true, 00:14:06.356 "nvme_iov_md": false 00:14:06.356 }, 00:14:06.356 "memory_domains": [ 00:14:06.356 { 00:14:06.356 "dma_device_id": "system", 00:14:06.356 "dma_device_type": 1 00:14:06.356 }, 00:14:06.356 { 00:14:06.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.356 "dma_device_type": 2 00:14:06.356 } 00:14:06.356 ], 00:14:06.356 "driver_specific": {} 00:14:06.356 }' 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.356 11:55:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.356 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.614 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.614 11:55:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.614 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.614 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.614 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.614 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.614 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.872 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.872 "name": "BaseBdev2", 00:14:06.872 "aliases": [ 00:14:06.872 "ca4548d3-8c93-4d2b-ad36-21675014029c" 00:14:06.872 ], 00:14:06.872 "product_name": "Malloc disk", 00:14:06.872 "block_size": 512, 00:14:06.872 "num_blocks": 65536, 00:14:06.872 "uuid": "ca4548d3-8c93-4d2b-ad36-21675014029c", 00:14:06.872 
"assigned_rate_limits": { 00:14:06.872 "rw_ios_per_sec": 0, 00:14:06.872 "rw_mbytes_per_sec": 0, 00:14:06.872 "r_mbytes_per_sec": 0, 00:14:06.872 "w_mbytes_per_sec": 0 00:14:06.872 }, 00:14:06.872 "claimed": true, 00:14:06.872 "claim_type": "exclusive_write", 00:14:06.872 "zoned": false, 00:14:06.872 "supported_io_types": { 00:14:06.872 "read": true, 00:14:06.872 "write": true, 00:14:06.872 "unmap": true, 00:14:06.872 "flush": true, 00:14:06.872 "reset": true, 00:14:06.872 "nvme_admin": false, 00:14:06.872 "nvme_io": false, 00:14:06.872 "nvme_io_md": false, 00:14:06.872 "write_zeroes": true, 00:14:06.872 "zcopy": true, 00:14:06.872 "get_zone_info": false, 00:14:06.872 "zone_management": false, 00:14:06.872 "zone_append": false, 00:14:06.872 "compare": false, 00:14:06.872 "compare_and_write": false, 00:14:06.872 "abort": true, 00:14:06.872 "seek_hole": false, 00:14:06.872 "seek_data": false, 00:14:06.872 "copy": true, 00:14:06.872 "nvme_iov_md": false 00:14:06.872 }, 00:14:06.872 "memory_domains": [ 00:14:06.872 { 00:14:06.872 "dma_device_id": "system", 00:14:06.872 "dma_device_type": 1 00:14:06.872 }, 00:14:06.872 { 00:14:06.872 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.872 "dma_device_type": 2 00:14:06.872 } 00:14:06.872 ], 00:14:06.872 "driver_specific": {} 00:14:06.872 }' 00:14:06.872 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.872 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.872 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.872 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.131 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:07.389 [2024-07-15 11:55:20.906388] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:07.389 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.390 11:55:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.647 11:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.647 "name": "Existed_Raid", 00:14:07.647 "uuid": "6e76eecf-f510-4376-abbf-3995fc872cd6", 00:14:07.647 "strip_size_kb": 0, 00:14:07.647 "state": "online", 00:14:07.647 "raid_level": "raid1", 00:14:07.647 "superblock": true, 00:14:07.647 "num_base_bdevs": 2, 00:14:07.647 "num_base_bdevs_discovered": 1, 00:14:07.647 "num_base_bdevs_operational": 1, 00:14:07.647 "base_bdevs_list": [ 00:14:07.647 { 00:14:07.647 "name": null, 00:14:07.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.647 "is_configured": false, 00:14:07.648 "data_offset": 2048, 00:14:07.648 "data_size": 63488 00:14:07.648 }, 00:14:07.648 { 00:14:07.648 "name": "BaseBdev2", 00:14:07.648 "uuid": "ca4548d3-8c93-4d2b-ad36-21675014029c", 00:14:07.648 "is_configured": true, 00:14:07.648 "data_offset": 2048, 00:14:07.648 "data_size": 63488 00:14:07.648 } 00:14:07.648 ] 00:14:07.648 }' 00:14:07.648 11:55:21 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.648 11:55:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.214 11:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:08.214 11:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:08.214 11:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:08.214 11:55:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.472 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:08.472 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:08.472 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:08.730 [2024-07-15 11:55:22.242968] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:08.730 [2024-07-15 11:55:22.243054] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:08.730 [2024-07-15 11:55:22.253869] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:08.730 [2024-07-15 11:55:22.253904] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:08.731 [2024-07-15 11:55:22.253915] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11f4150 name Existed_Raid, state offline 00:14:08.731 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:08.731 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < 
num_base_bdevs )) 00:14:08.731 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.731 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:08.988 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1470121 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1470121 ']' 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1470121 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1470121 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1470121' 00:14:08.989 killing process with pid 1470121 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1470121 00:14:08.989 [2024-07-15 11:55:22.575428] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:14:08.989 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1470121 00:14:08.989 [2024-07-15 11:55:22.576287] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:09.248 11:55:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:09.248 00:14:09.248 real 0m11.413s 00:14:09.248 user 0m20.321s 00:14:09.248 sys 0m2.107s 00:14:09.248 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:09.248 11:55:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.248 ************************************ 00:14:09.248 END TEST raid_state_function_test_sb 00:14:09.248 ************************************ 00:14:09.248 11:55:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:09.248 11:55:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:14:09.248 11:55:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:14:09.248 11:55:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:09.248 11:55:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:09.528 ************************************ 00:14:09.528 START TEST raid_superblock_test 00:14:09.528 ************************************ 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1471821 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1471821 /var/tmp/spdk-raid.sock 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1471821 ']' 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start 
up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:09.528 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:09.528 11:55:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:09.528 [2024-07-15 11:55:22.927947] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:14:09.528 [2024-07-15 11:55:22.928019] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1471821 ] 00:14:09.528 [2024-07-15 11:55:23.055101] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:09.789 [2024-07-15 11:55:23.162394] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:09.789 [2024-07-15 11:55:23.231631] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:09.789 [2024-07-15 11:55:23.231668] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:10.353 
11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:10.353 11:55:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:10.610 malloc1 00:14:10.610 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:10.868 [2024-07-15 11:55:24.209080] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:10.868 [2024-07-15 11:55:24.209129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.868 [2024-07-15 11:55:24.209150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bcb560 00:14:10.868 [2024-07-15 11:55:24.209163] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:10.868 [2024-07-15 11:55:24.210780] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.868 [2024-07-15 11:55:24.210808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:10.868 pt1 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:10.868 11:55:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:10.868 malloc2 00:14:10.868 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:11.125 [2024-07-15 11:55:24.655130] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:11.125 [2024-07-15 11:55:24.655181] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:11.125 [2024-07-15 11:55:24.655200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c695b0 00:14:11.125 [2024-07-15 11:55:24.655213] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:11.125 [2024-07-15 11:55:24.656849] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:11.125 [2024-07-15 11:55:24.656877] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:11.125 pt2 00:14:11.125 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:11.125 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:11.125 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:14:11.389 [2024-07-15 11:55:24.891784] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:11.389 [2024-07-15 11:55:24.893160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:11.389 [2024-07-15 11:55:24.893316] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c6adb0 00:14:11.389 [2024-07-15 11:55:24.893329] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:11.389 [2024-07-15 11:55:24.893529] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6bce0 00:14:11.389 [2024-07-15 11:55:24.893673] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c6adb0 00:14:11.389 [2024-07-15 11:55:24.893683] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c6adb0 00:14:11.389 [2024-07-15 11:55:24.893803] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:11.389 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:11.389 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:11.389 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:11.389 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.390 11:55:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:11.647 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:11.647 "name": "raid_bdev1", 00:14:11.647 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:11.647 "strip_size_kb": 0, 00:14:11.647 "state": "online", 00:14:11.647 "raid_level": "raid1", 00:14:11.647 "superblock": true, 00:14:11.647 "num_base_bdevs": 2, 00:14:11.647 "num_base_bdevs_discovered": 2, 00:14:11.647 "num_base_bdevs_operational": 2, 00:14:11.647 "base_bdevs_list": [ 00:14:11.647 { 00:14:11.647 "name": "pt1", 00:14:11.647 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.647 "is_configured": true, 00:14:11.647 "data_offset": 2048, 00:14:11.647 "data_size": 63488 00:14:11.647 }, 00:14:11.647 { 00:14:11.647 "name": "pt2", 00:14:11.647 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.647 "is_configured": true, 00:14:11.647 "data_offset": 2048, 00:14:11.647 "data_size": 63488 00:14:11.647 } 00:14:11.647 ] 00:14:11.647 }' 00:14:11.647 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:11.647 11:55:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:12.577 11:55:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:12.577 11:55:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:12.577 [2024-07-15 11:55:26.087171] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:12.577 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:12.577 "name": "raid_bdev1", 00:14:12.577 "aliases": [ 00:14:12.577 "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c" 00:14:12.577 ], 00:14:12.577 "product_name": "Raid Volume", 00:14:12.577 "block_size": 512, 00:14:12.577 "num_blocks": 63488, 00:14:12.577 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:12.577 "assigned_rate_limits": { 00:14:12.577 "rw_ios_per_sec": 0, 00:14:12.577 "rw_mbytes_per_sec": 0, 00:14:12.577 "r_mbytes_per_sec": 0, 00:14:12.577 "w_mbytes_per_sec": 0 00:14:12.577 }, 00:14:12.577 "claimed": false, 00:14:12.577 "zoned": false, 00:14:12.577 "supported_io_types": { 00:14:12.577 "read": true, 00:14:12.577 "write": true, 00:14:12.577 "unmap": false, 00:14:12.577 "flush": false, 00:14:12.577 "reset": true, 00:14:12.577 "nvme_admin": false, 00:14:12.577 "nvme_io": false, 00:14:12.577 "nvme_io_md": false, 00:14:12.577 "write_zeroes": true, 00:14:12.577 "zcopy": false, 00:14:12.577 "get_zone_info": false, 00:14:12.577 "zone_management": false, 00:14:12.577 "zone_append": false, 00:14:12.577 "compare": false, 00:14:12.577 "compare_and_write": false, 00:14:12.577 
"abort": false, 00:14:12.577 "seek_hole": false, 00:14:12.577 "seek_data": false, 00:14:12.577 "copy": false, 00:14:12.577 "nvme_iov_md": false 00:14:12.577 }, 00:14:12.577 "memory_domains": [ 00:14:12.577 { 00:14:12.577 "dma_device_id": "system", 00:14:12.577 "dma_device_type": 1 00:14:12.577 }, 00:14:12.577 { 00:14:12.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.577 "dma_device_type": 2 00:14:12.577 }, 00:14:12.577 { 00:14:12.577 "dma_device_id": "system", 00:14:12.577 "dma_device_type": 1 00:14:12.577 }, 00:14:12.577 { 00:14:12.577 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.577 "dma_device_type": 2 00:14:12.577 } 00:14:12.577 ], 00:14:12.577 "driver_specific": { 00:14:12.577 "raid": { 00:14:12.577 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:12.577 "strip_size_kb": 0, 00:14:12.577 "state": "online", 00:14:12.577 "raid_level": "raid1", 00:14:12.577 "superblock": true, 00:14:12.577 "num_base_bdevs": 2, 00:14:12.577 "num_base_bdevs_discovered": 2, 00:14:12.577 "num_base_bdevs_operational": 2, 00:14:12.577 "base_bdevs_list": [ 00:14:12.577 { 00:14:12.577 "name": "pt1", 00:14:12.577 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.577 "is_configured": true, 00:14:12.577 "data_offset": 2048, 00:14:12.577 "data_size": 63488 00:14:12.577 }, 00:14:12.577 { 00:14:12.577 "name": "pt2", 00:14:12.577 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.577 "is_configured": true, 00:14:12.577 "data_offset": 2048, 00:14:12.577 "data_size": 63488 00:14:12.577 } 00:14:12.577 ] 00:14:12.577 } 00:14:12.577 } 00:14:12.577 }' 00:14:12.577 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:12.577 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:12.577 pt2' 00:14:12.577 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.577 11:55:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:12.577 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.836 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.836 "name": "pt1", 00:14:12.836 "aliases": [ 00:14:12.836 "00000000-0000-0000-0000-000000000001" 00:14:12.836 ], 00:14:12.836 "product_name": "passthru", 00:14:12.836 "block_size": 512, 00:14:12.836 "num_blocks": 65536, 00:14:12.836 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:12.836 "assigned_rate_limits": { 00:14:12.836 "rw_ios_per_sec": 0, 00:14:12.836 "rw_mbytes_per_sec": 0, 00:14:12.836 "r_mbytes_per_sec": 0, 00:14:12.836 "w_mbytes_per_sec": 0 00:14:12.836 }, 00:14:12.836 "claimed": true, 00:14:12.836 "claim_type": "exclusive_write", 00:14:12.836 "zoned": false, 00:14:12.836 "supported_io_types": { 00:14:12.836 "read": true, 00:14:12.836 "write": true, 00:14:12.836 "unmap": true, 00:14:12.836 "flush": true, 00:14:12.836 "reset": true, 00:14:12.836 "nvme_admin": false, 00:14:12.836 "nvme_io": false, 00:14:12.836 "nvme_io_md": false, 00:14:12.836 "write_zeroes": true, 00:14:12.836 "zcopy": true, 00:14:12.836 "get_zone_info": false, 00:14:12.836 "zone_management": false, 00:14:12.836 "zone_append": false, 00:14:12.836 "compare": false, 00:14:12.836 "compare_and_write": false, 00:14:12.836 "abort": true, 00:14:12.836 "seek_hole": false, 00:14:12.836 "seek_data": false, 00:14:12.836 "copy": true, 00:14:12.836 "nvme_iov_md": false 00:14:12.836 }, 00:14:12.836 "memory_domains": [ 00:14:12.836 { 00:14:12.836 "dma_device_id": "system", 00:14:12.836 "dma_device_type": 1 00:14:12.836 }, 00:14:12.836 { 00:14:12.836 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.836 "dma_device_type": 2 00:14:12.836 } 00:14:12.836 ], 00:14:12.836 "driver_specific": { 00:14:12.836 "passthru": { 00:14:12.836 
"name": "pt1", 00:14:12.836 "base_bdev_name": "malloc1" 00:14:12.836 } 00:14:12.836 } 00:14:12.836 }' 00:14:12.836 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.093 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.352 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.352 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.352 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:13.352 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:13.352 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:13.610 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:13.610 "name": "pt2", 00:14:13.610 "aliases": [ 00:14:13.610 "00000000-0000-0000-0000-000000000002" 00:14:13.610 ], 00:14:13.610 "product_name": "passthru", 00:14:13.610 "block_size": 512, 00:14:13.610 
"num_blocks": 65536, 00:14:13.610 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:13.610 "assigned_rate_limits": { 00:14:13.610 "rw_ios_per_sec": 0, 00:14:13.610 "rw_mbytes_per_sec": 0, 00:14:13.610 "r_mbytes_per_sec": 0, 00:14:13.610 "w_mbytes_per_sec": 0 00:14:13.610 }, 00:14:13.610 "claimed": true, 00:14:13.610 "claim_type": "exclusive_write", 00:14:13.610 "zoned": false, 00:14:13.610 "supported_io_types": { 00:14:13.610 "read": true, 00:14:13.610 "write": true, 00:14:13.610 "unmap": true, 00:14:13.610 "flush": true, 00:14:13.610 "reset": true, 00:14:13.610 "nvme_admin": false, 00:14:13.610 "nvme_io": false, 00:14:13.610 "nvme_io_md": false, 00:14:13.610 "write_zeroes": true, 00:14:13.610 "zcopy": true, 00:14:13.610 "get_zone_info": false, 00:14:13.610 "zone_management": false, 00:14:13.610 "zone_append": false, 00:14:13.610 "compare": false, 00:14:13.610 "compare_and_write": false, 00:14:13.610 "abort": true, 00:14:13.610 "seek_hole": false, 00:14:13.610 "seek_data": false, 00:14:13.610 "copy": true, 00:14:13.610 "nvme_iov_md": false 00:14:13.610 }, 00:14:13.610 "memory_domains": [ 00:14:13.610 { 00:14:13.610 "dma_device_id": "system", 00:14:13.610 "dma_device_type": 1 00:14:13.610 }, 00:14:13.610 { 00:14:13.610 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:13.610 "dma_device_type": 2 00:14:13.610 } 00:14:13.610 ], 00:14:13.610 "driver_specific": { 00:14:13.610 "passthru": { 00:14:13.610 "name": "pt2", 00:14:13.610 "base_bdev_name": "malloc2" 00:14:13.610 } 00:14:13.610 } 00:14:13.610 }' 00:14:13.610 11:55:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:13.610 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:13.869 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:14.126 [2024-07-15 11:55:27.587136] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:14.127 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=9459d0ad-6da1-47f4-bb3d-e2eef434fd1c 00:14:14.127 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 9459d0ad-6da1-47f4-bb3d-e2eef434fd1c ']' 00:14:14.127 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:14.385 [2024-07-15 11:55:27.835535] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:14.385 [2024-07-15 11:55:27.835561] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:14.385 [2024-07-15 11:55:27.835622] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:14.385 [2024-07-15 
11:55:27.835677] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:14.385 [2024-07-15 11:55:27.835697] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6adb0 name raid_bdev1, state offline 00:14:14.385 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.385 11:55:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:14.643 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:14.643 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:14.643 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.643 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:14.901 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:14.901 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:15.158 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:15.159 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:15.418 11:55:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:14:15.677 [2024-07-15 11:55:29.114860] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:15.677 [2024-07-15 11:55:29.116227] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:15.677 [2024-07-15 11:55:29.116280] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:15.677 [2024-07-15 11:55:29.116320] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:15.677 [2024-07-15 11:55:29.116340] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:15.677 [2024-07-15 11:55:29.116349] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6c030 name raid_bdev1, state configuring 00:14:15.677 request: 00:14:15.677 { 00:14:15.677 "name": "raid_bdev1", 00:14:15.677 "raid_level": "raid1", 00:14:15.677 "base_bdevs": [ 00:14:15.677 "malloc1", 00:14:15.677 "malloc2" 00:14:15.677 ], 00:14:15.677 "superblock": false, 00:14:15.677 "method": "bdev_raid_create", 00:14:15.677 "req_id": 1 00:14:15.677 } 00:14:15.677 Got JSON-RPC error response 00:14:15.677 response: 00:14:15.677 { 00:14:15.677 "code": -17, 00:14:15.677 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:15.677 } 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.677 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:15.934 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 
00:14:15.934 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:15.934 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:16.210 [2024-07-15 11:55:29.600093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:16.210 [2024-07-15 11:55:29.600143] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:16.210 [2024-07-15 11:55:29.600161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c6bae0 00:14:16.210 [2024-07-15 11:55:29.600174] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:16.210 [2024-07-15 11:55:29.601862] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:16.210 [2024-07-15 11:55:29.601889] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:16.210 [2024-07-15 11:55:29.601957] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:16.210 [2024-07-15 11:55:29.601982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:16.210 pt1 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.210 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.491 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.491 "name": "raid_bdev1", 00:14:16.491 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:16.491 "strip_size_kb": 0, 00:14:16.491 "state": "configuring", 00:14:16.491 "raid_level": "raid1", 00:14:16.491 "superblock": true, 00:14:16.491 "num_base_bdevs": 2, 00:14:16.491 "num_base_bdevs_discovered": 1, 00:14:16.491 "num_base_bdevs_operational": 2, 00:14:16.491 "base_bdevs_list": [ 00:14:16.491 { 00:14:16.491 "name": "pt1", 00:14:16.491 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.491 "is_configured": true, 00:14:16.491 "data_offset": 2048, 00:14:16.491 "data_size": 63488 00:14:16.491 }, 00:14:16.491 { 00:14:16.491 "name": null, 00:14:16.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.491 "is_configured": false, 00:14:16.491 "data_offset": 2048, 00:14:16.491 "data_size": 63488 00:14:16.491 } 00:14:16.491 ] 00:14:16.491 }' 00:14:16.491 11:55:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.491 11:55:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.067 11:55:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:14:17.067 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:17.067 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:17.067 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:17.325 [2024-07-15 11:55:30.695004] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:17.325 [2024-07-15 11:55:30.695061] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.325 [2024-07-15 11:55:30.695082] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bca940 00:14:17.325 [2024-07-15 11:55:30.695094] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.325 [2024-07-15 11:55:30.695439] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.325 [2024-07-15 11:55:30.695456] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:17.325 [2024-07-15 11:55:30.695518] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:17.325 [2024-07-15 11:55:30.695538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:17.325 [2024-07-15 11:55:30.695636] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bcd2c0 00:14:17.325 [2024-07-15 11:55:30.695646] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:17.325 [2024-07-15 11:55:30.695818] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bce100 00:14:17.325 [2024-07-15 11:55:30.695941] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bcd2c0 00:14:17.325 [2024-07-15 11:55:30.695951] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bcd2c0 00:14:17.325 [2024-07-15 11:55:30.696048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:17.325 pt2 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.325 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.326 "name": 
"raid_bdev1", 00:14:17.326 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:17.326 "strip_size_kb": 0, 00:14:17.326 "state": "online", 00:14:17.326 "raid_level": "raid1", 00:14:17.326 "superblock": true, 00:14:17.326 "num_base_bdevs": 2, 00:14:17.326 "num_base_bdevs_discovered": 2, 00:14:17.326 "num_base_bdevs_operational": 2, 00:14:17.326 "base_bdevs_list": [ 00:14:17.326 { 00:14:17.326 "name": "pt1", 00:14:17.326 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.326 "is_configured": true, 00:14:17.326 "data_offset": 2048, 00:14:17.326 "data_size": 63488 00:14:17.326 }, 00:14:17.326 { 00:14:17.326 "name": "pt2", 00:14:17.326 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.326 "is_configured": true, 00:14:17.326 "data_offset": 2048, 00:14:17.326 "data_size": 63488 00:14:17.326 } 00:14:17.326 ] 00:14:17.326 }' 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.326 11:55:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:18.261 [2024-07-15 
11:55:31.725968] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:18.261 "name": "raid_bdev1", 00:14:18.261 "aliases": [ 00:14:18.261 "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c" 00:14:18.261 ], 00:14:18.261 "product_name": "Raid Volume", 00:14:18.261 "block_size": 512, 00:14:18.261 "num_blocks": 63488, 00:14:18.261 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:18.261 "assigned_rate_limits": { 00:14:18.261 "rw_ios_per_sec": 0, 00:14:18.261 "rw_mbytes_per_sec": 0, 00:14:18.261 "r_mbytes_per_sec": 0, 00:14:18.261 "w_mbytes_per_sec": 0 00:14:18.261 }, 00:14:18.261 "claimed": false, 00:14:18.261 "zoned": false, 00:14:18.261 "supported_io_types": { 00:14:18.261 "read": true, 00:14:18.261 "write": true, 00:14:18.261 "unmap": false, 00:14:18.261 "flush": false, 00:14:18.261 "reset": true, 00:14:18.261 "nvme_admin": false, 00:14:18.261 "nvme_io": false, 00:14:18.261 "nvme_io_md": false, 00:14:18.261 "write_zeroes": true, 00:14:18.261 "zcopy": false, 00:14:18.261 "get_zone_info": false, 00:14:18.261 "zone_management": false, 00:14:18.261 "zone_append": false, 00:14:18.261 "compare": false, 00:14:18.261 "compare_and_write": false, 00:14:18.261 "abort": false, 00:14:18.261 "seek_hole": false, 00:14:18.261 "seek_data": false, 00:14:18.261 "copy": false, 00:14:18.261 "nvme_iov_md": false 00:14:18.261 }, 00:14:18.261 "memory_domains": [ 00:14:18.261 { 00:14:18.261 "dma_device_id": "system", 00:14:18.261 "dma_device_type": 1 00:14:18.261 }, 00:14:18.261 { 00:14:18.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.261 "dma_device_type": 2 00:14:18.261 }, 00:14:18.261 { 00:14:18.261 "dma_device_id": "system", 00:14:18.261 "dma_device_type": 1 00:14:18.261 }, 00:14:18.261 { 00:14:18.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.261 "dma_device_type": 2 00:14:18.261 } 00:14:18.261 ], 00:14:18.261 "driver_specific": { 00:14:18.261 
"raid": { 00:14:18.261 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:18.261 "strip_size_kb": 0, 00:14:18.261 "state": "online", 00:14:18.261 "raid_level": "raid1", 00:14:18.261 "superblock": true, 00:14:18.261 "num_base_bdevs": 2, 00:14:18.261 "num_base_bdevs_discovered": 2, 00:14:18.261 "num_base_bdevs_operational": 2, 00:14:18.261 "base_bdevs_list": [ 00:14:18.261 { 00:14:18.261 "name": "pt1", 00:14:18.261 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:18.261 "is_configured": true, 00:14:18.261 "data_offset": 2048, 00:14:18.261 "data_size": 63488 00:14:18.261 }, 00:14:18.261 { 00:14:18.261 "name": "pt2", 00:14:18.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:18.261 "is_configured": true, 00:14:18.261 "data_offset": 2048, 00:14:18.261 "data_size": 63488 00:14:18.261 } 00:14:18.261 ] 00:14:18.261 } 00:14:18.261 } 00:14:18.261 }' 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:18.261 pt2' 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.261 11:55:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:18.519 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.519 "name": "pt1", 00:14:18.519 "aliases": [ 00:14:18.519 "00000000-0000-0000-0000-000000000001" 00:14:18.519 ], 00:14:18.519 "product_name": "passthru", 00:14:18.519 "block_size": 512, 00:14:18.519 "num_blocks": 65536, 00:14:18.519 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:18.519 "assigned_rate_limits": { 
00:14:18.519 "rw_ios_per_sec": 0, 00:14:18.519 "rw_mbytes_per_sec": 0, 00:14:18.519 "r_mbytes_per_sec": 0, 00:14:18.519 "w_mbytes_per_sec": 0 00:14:18.519 }, 00:14:18.519 "claimed": true, 00:14:18.520 "claim_type": "exclusive_write", 00:14:18.520 "zoned": false, 00:14:18.520 "supported_io_types": { 00:14:18.520 "read": true, 00:14:18.520 "write": true, 00:14:18.520 "unmap": true, 00:14:18.520 "flush": true, 00:14:18.520 "reset": true, 00:14:18.520 "nvme_admin": false, 00:14:18.520 "nvme_io": false, 00:14:18.520 "nvme_io_md": false, 00:14:18.520 "write_zeroes": true, 00:14:18.520 "zcopy": true, 00:14:18.520 "get_zone_info": false, 00:14:18.520 "zone_management": false, 00:14:18.520 "zone_append": false, 00:14:18.520 "compare": false, 00:14:18.520 "compare_and_write": false, 00:14:18.520 "abort": true, 00:14:18.520 "seek_hole": false, 00:14:18.520 "seek_data": false, 00:14:18.520 "copy": true, 00:14:18.520 "nvme_iov_md": false 00:14:18.520 }, 00:14:18.520 "memory_domains": [ 00:14:18.520 { 00:14:18.520 "dma_device_id": "system", 00:14:18.520 "dma_device_type": 1 00:14:18.520 }, 00:14:18.520 { 00:14:18.520 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.520 "dma_device_type": 2 00:14:18.520 } 00:14:18.520 ], 00:14:18.520 "driver_specific": { 00:14:18.520 "passthru": { 00:14:18.520 "name": "pt1", 00:14:18.520 "base_bdev_name": "malloc1" 00:14:18.520 } 00:14:18.520 } 00:14:18.520 }' 00:14:18.520 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.520 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.779 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:19.037 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.605 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.605 "name": "pt2", 00:14:19.605 "aliases": [ 00:14:19.605 "00000000-0000-0000-0000-000000000002" 00:14:19.605 ], 00:14:19.605 "product_name": "passthru", 00:14:19.605 "block_size": 512, 00:14:19.605 "num_blocks": 65536, 00:14:19.605 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.605 "assigned_rate_limits": { 00:14:19.605 "rw_ios_per_sec": 0, 00:14:19.605 "rw_mbytes_per_sec": 0, 00:14:19.605 "r_mbytes_per_sec": 0, 00:14:19.605 "w_mbytes_per_sec": 0 00:14:19.605 }, 00:14:19.605 "claimed": true, 00:14:19.605 "claim_type": "exclusive_write", 00:14:19.605 "zoned": false, 00:14:19.605 "supported_io_types": { 00:14:19.605 "read": true, 00:14:19.605 "write": true, 00:14:19.605 "unmap": true, 00:14:19.605 "flush": true, 00:14:19.605 "reset": true, 00:14:19.605 "nvme_admin": false, 00:14:19.605 "nvme_io": false, 00:14:19.605 "nvme_io_md": false, 00:14:19.605 "write_zeroes": true, 
00:14:19.605 "zcopy": true, 00:14:19.605 "get_zone_info": false, 00:14:19.605 "zone_management": false, 00:14:19.605 "zone_append": false, 00:14:19.605 "compare": false, 00:14:19.605 "compare_and_write": false, 00:14:19.605 "abort": true, 00:14:19.605 "seek_hole": false, 00:14:19.605 "seek_data": false, 00:14:19.605 "copy": true, 00:14:19.605 "nvme_iov_md": false 00:14:19.605 }, 00:14:19.605 "memory_domains": [ 00:14:19.605 { 00:14:19.605 "dma_device_id": "system", 00:14:19.605 "dma_device_type": 1 00:14:19.605 }, 00:14:19.605 { 00:14:19.605 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.605 "dma_device_type": 2 00:14:19.605 } 00:14:19.605 ], 00:14:19.605 "driver_specific": { 00:14:19.605 "passthru": { 00:14:19.605 "name": "pt2", 00:14:19.605 "base_bdev_name": "malloc2" 00:14:19.605 } 00:14:19.605 } 00:14:19.605 }' 00:14:19.605 11:55:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.605 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # 
[[ null == null ]] 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:19.864 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.122 [2024-07-15 11:55:33.574932] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.122 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 9459d0ad-6da1-47f4-bb3d-e2eef434fd1c '!=' 9459d0ad-6da1-47f4-bb3d-e2eef434fd1c ']' 00:14:20.122 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:14:20.122 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:20.122 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:20.122 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:20.379 [2024-07-15 11:55:33.819334] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:20.379 11:55:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:20.638 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:20.638 "name": "raid_bdev1", 00:14:20.638 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:20.638 "strip_size_kb": 0, 00:14:20.638 "state": "online", 00:14:20.638 "raid_level": "raid1", 00:14:20.638 "superblock": true, 00:14:20.638 "num_base_bdevs": 2, 00:14:20.638 "num_base_bdevs_discovered": 1, 00:14:20.638 "num_base_bdevs_operational": 1, 00:14:20.638 "base_bdevs_list": [ 00:14:20.638 { 00:14:20.638 "name": null, 00:14:20.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:20.638 "is_configured": false, 00:14:20.638 "data_offset": 2048, 00:14:20.638 "data_size": 63488 00:14:20.638 }, 00:14:20.638 { 00:14:20.638 "name": "pt2", 00:14:20.638 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:20.638 "is_configured": true, 00:14:20.638 "data_offset": 2048, 00:14:20.638 "data_size": 63488 00:14:20.638 } 00:14:20.638 ] 00:14:20.638 }' 00:14:20.638 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:20.638 11:55:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:21.572 11:55:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 
00:14:21.572 [2024-07-15 11:55:35.030524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:21.572 [2024-07-15 11:55:35.030552] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:21.572 [2024-07-15 11:55:35.030607] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:21.572 [2024-07-15 11:55:35.030650] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:21.572 [2024-07-15 11:55:35.030662] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bcd2c0 name raid_bdev1, state offline 00:14:21.572 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:21.572 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:14:21.830 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:14:21.830 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:14:21.830 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:14:21.830 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:21.830 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:22.089 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:14:22.089 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:14:22.089 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:14:22.089 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:14:22.089 11:55:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:14:22.089 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:22.348 [2024-07-15 11:55:35.740361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:22.348 [2024-07-15 11:55:35.740404] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.348 [2024-07-15 11:55:35.740421] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bca5a0 00:14:22.348 [2024-07-15 11:55:35.740433] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.348 [2024-07-15 11:55:35.742129] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.348 [2024-07-15 11:55:35.742158] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:22.348 [2024-07-15 11:55:35.742225] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:22.348 [2024-07-15 11:55:35.742251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:22.348 [2024-07-15 11:55:35.742334] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bcde10 00:14:22.348 [2024-07-15 11:55:35.742351] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:22.348 [2024-07-15 11:55:35.742522] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6c950 00:14:22.348 [2024-07-15 11:55:35.742640] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bcde10 00:14:22.348 [2024-07-15 11:55:35.742650] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bcde10 00:14:22.348 [2024-07-15 11:55:35.742753] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.348 pt2 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.348 11:55:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.916 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.916 "name": "raid_bdev1", 00:14:22.916 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:22.916 "strip_size_kb": 0, 00:14:22.916 "state": "online", 00:14:22.916 "raid_level": "raid1", 00:14:22.916 "superblock": true, 00:14:22.916 "num_base_bdevs": 2, 00:14:22.916 "num_base_bdevs_discovered": 1, 00:14:22.916 "num_base_bdevs_operational": 1, 00:14:22.916 "base_bdevs_list": [ 
00:14:22.916 { 00:14:22.916 "name": null, 00:14:22.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:22.916 "is_configured": false, 00:14:22.916 "data_offset": 2048, 00:14:22.916 "data_size": 63488 00:14:22.916 }, 00:14:22.916 { 00:14:22.916 "name": "pt2", 00:14:22.916 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:22.916 "is_configured": true, 00:14:22.916 "data_offset": 2048, 00:14:22.916 "data_size": 63488 00:14:22.916 } 00:14:22.916 ] 00:14:22.916 }' 00:14:22.916 11:55:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.916 11:55:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.484 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:23.743 [2024-07-15 11:55:37.232299] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:23.743 [2024-07-15 11:55:37.232326] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:23.743 [2024-07-15 11:55:37.232382] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:23.743 [2024-07-15 11:55:37.232428] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:23.743 [2024-07-15 11:55:37.232439] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bcde10 name raid_bdev1, state offline 00:14:23.743 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:23.743 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:14:24.003 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:14:24.003 11:55:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:14:24.003 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:14:24.003 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:24.262 [2024-07-15 11:55:37.737608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:24.262 [2024-07-15 11:55:37.737656] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.262 [2024-07-15 11:55:37.737673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c697e0 00:14:24.262 [2024-07-15 11:55:37.737690] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.262 [2024-07-15 11:55:37.739333] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.262 [2024-07-15 11:55:37.739359] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:24.262 [2024-07-15 11:55:37.739421] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:24.262 [2024-07-15 11:55:37.739446] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:24.262 [2024-07-15 11:55:37.739546] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:14:24.262 [2024-07-15 11:55:37.739559] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:24.262 [2024-07-15 11:55:37.739571] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6ea00 name raid_bdev1, state configuring 00:14:24.262 [2024-07-15 11:55:37.739594] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:24.262 [2024-07-15 11:55:37.739647] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c6d760 00:14:24.262 [2024-07-15 11:55:37.739657] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:24.262 [2024-07-15 11:55:37.739827] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c6ef30 00:14:24.262 [2024-07-15 11:55:37.739946] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c6d760 00:14:24.262 [2024-07-15 11:55:37.739956] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c6d760 00:14:24.262 [2024-07-15 11:55:37.740054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:24.262 pt1 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:24.262 11:55:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:24.521 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:24.521 "name": "raid_bdev1", 00:14:24.521 "uuid": "9459d0ad-6da1-47f4-bb3d-e2eef434fd1c", 00:14:24.521 "strip_size_kb": 0, 00:14:24.521 "state": "online", 00:14:24.521 "raid_level": "raid1", 00:14:24.521 "superblock": true, 00:14:24.521 "num_base_bdevs": 2, 00:14:24.521 "num_base_bdevs_discovered": 1, 00:14:24.521 "num_base_bdevs_operational": 1, 00:14:24.521 "base_bdevs_list": [ 00:14:24.521 { 00:14:24.521 "name": null, 00:14:24.521 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:24.521 "is_configured": false, 00:14:24.521 "data_offset": 2048, 00:14:24.521 "data_size": 63488 00:14:24.521 }, 00:14:24.521 { 00:14:24.521 "name": "pt2", 00:14:24.521 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:24.521 "is_configured": true, 00:14:24.521 "data_offset": 2048, 00:14:24.521 "data_size": 63488 00:14:24.521 } 00:14:24.521 ] 00:14:24.521 }' 00:14:24.521 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:24.521 11:55:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.090 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:14:25.090 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:14:25.349 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:14:25.349 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:25.349 11:55:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:14:25.608 [2024-07-15 11:55:39.097436] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 9459d0ad-6da1-47f4-bb3d-e2eef434fd1c '!=' 9459d0ad-6da1-47f4-bb3d-e2eef434fd1c ']' 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1471821 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1471821 ']' 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1471821 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1471821 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1471821' 00:14:25.608 killing process with pid 1471821 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1471821 00:14:25.608 [2024-07-15 11:55:39.166146] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:25.608 [2024-07-15 11:55:39.166204] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.608 [2024-07-15 11:55:39.166249] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: 
raid bdev base bdevs is 0, going to free all in destruct 00:14:25.608 [2024-07-15 11:55:39.166260] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c6d760 name raid_bdev1, state offline 00:14:25.608 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1471821 00:14:25.608 [2024-07-15 11:55:39.185431] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:25.867 11:55:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:25.867 00:14:25.867 real 0m16.548s 00:14:25.867 user 0m30.069s 00:14:25.867 sys 0m2.936s 00:14:25.867 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:25.867 11:55:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:25.867 ************************************ 00:14:25.867 END TEST raid_superblock_test 00:14:25.867 ************************************ 00:14:25.867 11:55:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:25.867 11:55:39 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:14:25.867 11:55:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:25.867 11:55:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:25.867 11:55:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:26.126 ************************************ 00:14:26.126 START TEST raid_read_error_test 00:14:26.126 ************************************ 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 read 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:26.126 
11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.yPjG1Pk1Is 00:14:26.126 11:55:39 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1474259 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1474259 /var/tmp/spdk-raid.sock 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1474259 ']' 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:26.126 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:26.126 11:55:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.126 [2024-07-15 11:55:39.567508] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:14:26.126 [2024-07-15 11:55:39.567575] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1474259 ] 00:14:26.126 [2024-07-15 11:55:39.695862] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:26.384 [2024-07-15 11:55:39.798030] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:26.384 [2024-07-15 11:55:39.855892] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.384 [2024-07-15 11:55:39.855925] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:26.950 11:55:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:26.950 11:55:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:26.950 11:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:26.950 11:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:27.208 BaseBdev1_malloc 00:14:27.208 11:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:27.466 true 00:14:27.466 11:55:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:27.723 [2024-07-15 11:55:41.196380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:27.723 [2024-07-15 11:55:41.196424] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:27.723 [2024-07-15 11:55:41.196443] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2a4e0 00:14:27.723 [2024-07-15 11:55:41.196455] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:27.723 [2024-07-15 11:55:41.198194] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:27.723 [2024-07-15 11:55:41.198221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:27.723 BaseBdev1 00:14:27.723 11:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:27.723 11:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:27.981 BaseBdev2_malloc 00:14:27.981 11:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:28.240 true 00:14:28.240 11:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:28.498 [2024-07-15 11:55:41.931001] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:28.498 [2024-07-15 11:55:41.931046] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:28.498 [2024-07-15 11:55:41.931066] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb2f7b0 00:14:28.498 [2024-07-15 11:55:41.931078] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:28.498 [2024-07-15 11:55:41.932666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:28.498 [2024-07-15 11:55:41.932699] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:28.498 BaseBdev2 00:14:28.498 11:55:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:28.756 [2024-07-15 11:55:42.171659] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:28.756 [2024-07-15 11:55:42.173014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:28.756 [2024-07-15 11:55:42.173209] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb30e10 00:14:28.756 [2024-07-15 11:55:42.173223] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:28.756 [2024-07-15 11:55:42.173418] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9854c0 00:14:28.756 [2024-07-15 11:55:42.173569] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb30e10 00:14:28.756 [2024-07-15 11:55:42.173579] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb30e10 00:14:28.756 [2024-07-15 11:55:42.173693] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:28.756 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:28.757 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:29.014 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.014 "name": "raid_bdev1", 00:14:29.014 "uuid": "cfe54e8b-6f55-478c-80fd-821f34cdd451", 00:14:29.014 "strip_size_kb": 0, 00:14:29.014 "state": "online", 00:14:29.014 "raid_level": "raid1", 00:14:29.014 "superblock": true, 00:14:29.014 "num_base_bdevs": 2, 00:14:29.014 "num_base_bdevs_discovered": 2, 00:14:29.014 "num_base_bdevs_operational": 2, 00:14:29.014 "base_bdevs_list": [ 00:14:29.014 { 00:14:29.014 "name": "BaseBdev1", 00:14:29.014 "uuid": "641d2fe1-ff47-58eb-a424-210738088e66", 00:14:29.014 "is_configured": true, 00:14:29.014 "data_offset": 2048, 00:14:29.014 "data_size": 63488 00:14:29.014 }, 00:14:29.014 { 00:14:29.014 "name": "BaseBdev2", 00:14:29.014 "uuid": "423a4623-e703-5d49-b278-c0e6a6fbc442", 00:14:29.014 "is_configured": true, 00:14:29.014 "data_offset": 2048, 00:14:29.014 "data_size": 63488 00:14:29.014 } 00:14:29.014 ] 00:14:29.014 }' 00:14:29.014 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.014 11:55:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.581 11:55:42 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:29.581 11:55:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:29.581 [2024-07-15 11:55:43.102417] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb2bef0 00:14:30.518 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.777 11:55:44 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.777 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.036 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:31.036 "name": "raid_bdev1", 00:14:31.036 "uuid": "cfe54e8b-6f55-478c-80fd-821f34cdd451", 00:14:31.036 "strip_size_kb": 0, 00:14:31.036 "state": "online", 00:14:31.036 "raid_level": "raid1", 00:14:31.036 "superblock": true, 00:14:31.036 "num_base_bdevs": 2, 00:14:31.036 "num_base_bdevs_discovered": 2, 00:14:31.036 "num_base_bdevs_operational": 2, 00:14:31.036 "base_bdevs_list": [ 00:14:31.036 { 00:14:31.036 "name": "BaseBdev1", 00:14:31.036 "uuid": "641d2fe1-ff47-58eb-a424-210738088e66", 00:14:31.036 "is_configured": true, 00:14:31.036 "data_offset": 2048, 00:14:31.036 "data_size": 63488 00:14:31.036 }, 00:14:31.036 { 00:14:31.036 "name": "BaseBdev2", 00:14:31.036 "uuid": "423a4623-e703-5d49-b278-c0e6a6fbc442", 00:14:31.036 "is_configured": true, 00:14:31.036 "data_offset": 2048, 00:14:31.036 "data_size": 63488 00:14:31.036 } 00:14:31.036 ] 00:14:31.036 }' 00:14:31.036 11:55:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:31.036 11:55:44 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.604 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:31.864 [2024-07-15 11:55:45.277719] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:14:31.864 [2024-07-15 11:55:45.277756] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:31.864 [2024-07-15 11:55:45.280932] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:31.864 [2024-07-15 11:55:45.280970] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.864 [2024-07-15 11:55:45.281047] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:31.864 [2024-07-15 11:55:45.281059] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb30e10 name raid_bdev1, state offline 00:14:31.864 0 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1474259 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1474259 ']' 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1474259 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1474259 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1474259' 00:14:31.864 killing process with pid 1474259 00:14:31.864 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1474259 00:14:31.864 [2024-07-15 11:55:45.364261] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:31.864 11:55:45 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1474259 00:14:31.864 [2024-07-15 11:55:45.375127] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.yPjG1Pk1Is 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:32.124 00:14:32.124 real 0m6.122s 00:14:32.124 user 0m9.559s 00:14:32.124 sys 0m1.071s 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:32.124 11:55:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.124 ************************************ 00:14:32.124 END TEST raid_read_error_test 00:14:32.124 ************************************ 00:14:32.124 11:55:45 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:32.124 11:55:45 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:14:32.124 11:55:45 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:32.124 11:55:45 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:32.124 11:55:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:32.124 ************************************ 00:14:32.124 START TEST raid_write_error_test 00:14:32.124 
************************************ 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 2 write 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.39ZHsr8tIs 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1475226 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1475226 /var/tmp/spdk-raid.sock 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1475226 ']' 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:32.124 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:32.124 11:55:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:32.384 [2024-07-15 11:55:45.774192] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:14:32.384 [2024-07-15 11:55:45.774258] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1475226 ] 00:14:32.384 [2024-07-15 11:55:45.904175] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:32.643 [2024-07-15 11:55:46.010263] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:32.643 [2024-07-15 11:55:46.078514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:32.643 [2024-07-15 11:55:46.078557] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:33.210 11:55:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:33.210 11:55:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:14:33.210 11:55:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:33.210 11:55:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:33.469 BaseBdev1_malloc 00:14:33.469 11:55:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:33.728 true 00:14:33.728 11:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:33.986 [2024-07-15 11:55:47.328468] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:33.986 [2024-07-15 11:55:47.328512] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:14:33.986 [2024-07-15 11:55:47.328532] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12a64e0 00:14:33.986 [2024-07-15 11:55:47.328544] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:33.987 [2024-07-15 11:55:47.330208] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:33.987 [2024-07-15 11:55:47.330234] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:33.987 BaseBdev1 00:14:33.987 11:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:33.987 11:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:34.245 BaseBdev2_malloc 00:14:34.245 11:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:34.504 true 00:14:34.504 11:55:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:34.504 [2024-07-15 11:55:48.095143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:34.504 [2024-07-15 11:55:48.095186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:34.504 [2024-07-15 11:55:48.095205] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ab7b0 00:14:34.504 [2024-07-15 11:55:48.095218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:34.504 [2024-07-15 11:55:48.096813] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:34.504 [2024-07-15 11:55:48.096841] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:34.504 BaseBdev2 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:14:34.764 [2024-07-15 11:55:48.327785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:34.764 [2024-07-15 11:55:48.329133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:34.764 [2024-07-15 11:55:48.329325] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ace10 00:14:34.764 [2024-07-15 11:55:48.329338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:14:34.764 [2024-07-15 11:55:48.329536] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11014c0 00:14:34.764 [2024-07-15 11:55:48.329699] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ace10 00:14:34.764 [2024-07-15 11:55:48.329710] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ace10 00:14:34.764 [2024-07-15 11:55:48.329818] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.764 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.765 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.765 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.765 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:35.024 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:35.024 "name": "raid_bdev1", 00:14:35.024 "uuid": "416d446a-7acf-4545-98f4-3431247a0a4f", 00:14:35.024 "strip_size_kb": 0, 00:14:35.024 "state": "online", 00:14:35.024 "raid_level": "raid1", 00:14:35.024 "superblock": true, 00:14:35.024 "num_base_bdevs": 2, 00:14:35.024 "num_base_bdevs_discovered": 2, 00:14:35.024 "num_base_bdevs_operational": 2, 00:14:35.024 "base_bdevs_list": [ 00:14:35.024 { 00:14:35.024 "name": "BaseBdev1", 00:14:35.024 "uuid": "416b49b9-917f-56af-a2a1-72e22efd7fca", 00:14:35.024 "is_configured": true, 00:14:35.024 "data_offset": 2048, 00:14:35.024 "data_size": 63488 00:14:35.024 }, 00:14:35.024 { 00:14:35.024 "name": "BaseBdev2", 00:14:35.024 "uuid": "c71077bf-2d2d-5288-9e14-70f99a44d49e", 00:14:35.024 "is_configured": true, 00:14:35.024 "data_offset": 2048, 00:14:35.024 "data_size": 63488 00:14:35.024 } 00:14:35.024 ] 00:14:35.024 }' 00:14:35.024 11:55:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:35.024 11:55:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.961 
11:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:35.961 11:55:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:35.961 [2024-07-15 11:55:49.298613] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a7ef0 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:36.899 [2024-07-15 11:55:50.421646] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:14:36.899 [2024-07-15 11:55:50.421716] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:36.899 [2024-07-15 11:55:50.421895] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x12a7ef0 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:36.899 11:55:50 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.899 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:37.158 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.158 "name": "raid_bdev1", 00:14:37.158 "uuid": "416d446a-7acf-4545-98f4-3431247a0a4f", 00:14:37.158 "strip_size_kb": 0, 00:14:37.158 "state": "online", 00:14:37.158 "raid_level": "raid1", 00:14:37.158 "superblock": true, 00:14:37.158 "num_base_bdevs": 2, 00:14:37.158 "num_base_bdevs_discovered": 1, 00:14:37.158 "num_base_bdevs_operational": 1, 00:14:37.158 "base_bdevs_list": [ 00:14:37.158 { 00:14:37.158 "name": null, 00:14:37.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.159 "is_configured": false, 00:14:37.159 "data_offset": 2048, 00:14:37.159 "data_size": 63488 00:14:37.159 }, 00:14:37.159 { 00:14:37.159 "name": "BaseBdev2", 00:14:37.159 "uuid": "c71077bf-2d2d-5288-9e14-70f99a44d49e", 00:14:37.159 "is_configured": true, 00:14:37.159 "data_offset": 2048, 00:14:37.159 "data_size": 63488 00:14:37.159 } 00:14:37.159 ] 00:14:37.159 }' 00:14:37.159 11:55:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:14:37.159 11:55:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:37.749 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:38.008 [2024-07-15 11:55:51.533229] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:38.008 [2024-07-15 11:55:51.533264] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:38.008 [2024-07-15 11:55:51.536560] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:38.008 [2024-07-15 11:55:51.536593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:38.008 [2024-07-15 11:55:51.536641] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:38.008 [2024-07-15 11:55:51.536652] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ace10 name raid_bdev1, state offline 00:14:38.008 0 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1475226 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1475226 ']' 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1475226 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:14:38.008 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1475226 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo 
']' 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1475226' 00:14:38.268 killing process with pid 1475226 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1475226 00:14:38.268 [2024-07-15 11:55:51.619179] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1475226 00:14:38.268 [2024-07-15 11:55:51.629873] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.39ZHsr8tIs 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:38.268 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:14:38.623 00:14:38.623 real 0m6.171s 00:14:38.623 user 0m9.567s 00:14:38.623 sys 0m1.155s 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:14:38.623 11:55:51 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.623 ************************************ 00:14:38.623 END TEST raid_write_error_test 00:14:38.623 ************************************ 00:14:38.623 11:55:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:14:38.623 11:55:51 bdev_raid -- 
bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:14:38.623 11:55:51 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:38.623 11:55:51 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:14:38.623 11:55:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:14:38.623 11:55:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:14:38.623 11:55:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:38.623 ************************************ 00:14:38.623 START TEST raid_state_function_test 00:14:38.623 ************************************ 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 false 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:38.623 11:55:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1476039 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1476039' 00:14:38.623 Process raid pid: 1476039 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1476039 /var/tmp/spdk-raid.sock 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1476039 ']' 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:38.623 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:14:38.623 11:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.623 [2024-07-15 11:55:52.032166] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:14:38.623 [2024-07-15 11:55:52.032235] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:38.624 [2024-07-15 11:55:52.160547] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:38.883 [2024-07-15 11:55:52.258529] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:38.883 [2024-07-15 11:55:52.321790] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:38.883 [2024-07-15 11:55:52.321827] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:39.451 11:55:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:14:39.451 11:55:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:14:39.451 11:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:39.711 [2024-07-15 11:55:53.188304] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:39.711 [2024-07-15 11:55:53.188347] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:39.711 [2024-07-15 11:55:53.188358] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:39.711 [2024-07-15 11:55:53.188370] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:39.711 [2024-07-15 11:55:53.188379] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:39.711 [2024-07-15 11:55:53.188393] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:39.711 11:55:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.711 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.972 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.972 "name": "Existed_Raid", 00:14:39.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.972 "strip_size_kb": 64, 00:14:39.972 "state": "configuring", 00:14:39.972 "raid_level": "raid0", 00:14:39.972 "superblock": false, 00:14:39.972 "num_base_bdevs": 3, 00:14:39.972 "num_base_bdevs_discovered": 0, 00:14:39.972 "num_base_bdevs_operational": 3, 00:14:39.972 "base_bdevs_list": [ 00:14:39.972 { 
00:14:39.972 "name": "BaseBdev1", 00:14:39.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.972 "is_configured": false, 00:14:39.972 "data_offset": 0, 00:14:39.972 "data_size": 0 00:14:39.972 }, 00:14:39.972 { 00:14:39.972 "name": "BaseBdev2", 00:14:39.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.972 "is_configured": false, 00:14:39.972 "data_offset": 0, 00:14:39.972 "data_size": 0 00:14:39.972 }, 00:14:39.972 { 00:14:39.972 "name": "BaseBdev3", 00:14:39.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.972 "is_configured": false, 00:14:39.972 "data_offset": 0, 00:14:39.972 "data_size": 0 00:14:39.972 } 00:14:39.972 ] 00:14:39.972 }' 00:14:39.972 11:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.972 11:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.541 11:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:40.799 [2024-07-15 11:55:54.299085] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:40.799 [2024-07-15 11:55:54.299116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd9b00 name Existed_Raid, state configuring 00:14:40.799 11:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:41.059 [2024-07-15 11:55:54.547767] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:41.059 [2024-07-15 11:55:54.547798] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:41.059 [2024-07-15 11:55:54.547808] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:14:41.059 [2024-07-15 11:55:54.547819] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:41.059 [2024-07-15 11:55:54.547828] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:41.059 [2024-07-15 11:55:54.547839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:41.059 11:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:41.318 [2024-07-15 11:55:54.806356] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:41.318 BaseBdev1 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:41.318 11:55:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:41.577 11:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:41.837 [ 00:14:41.837 { 00:14:41.837 "name": "BaseBdev1", 00:14:41.837 "aliases": [ 00:14:41.837 
"ed346f6a-9f87-484f-bd3f-beed15449221" 00:14:41.837 ], 00:14:41.837 "product_name": "Malloc disk", 00:14:41.837 "block_size": 512, 00:14:41.837 "num_blocks": 65536, 00:14:41.837 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:41.837 "assigned_rate_limits": { 00:14:41.837 "rw_ios_per_sec": 0, 00:14:41.837 "rw_mbytes_per_sec": 0, 00:14:41.837 "r_mbytes_per_sec": 0, 00:14:41.837 "w_mbytes_per_sec": 0 00:14:41.837 }, 00:14:41.837 "claimed": true, 00:14:41.837 "claim_type": "exclusive_write", 00:14:41.837 "zoned": false, 00:14:41.837 "supported_io_types": { 00:14:41.837 "read": true, 00:14:41.837 "write": true, 00:14:41.837 "unmap": true, 00:14:41.837 "flush": true, 00:14:41.837 "reset": true, 00:14:41.837 "nvme_admin": false, 00:14:41.837 "nvme_io": false, 00:14:41.837 "nvme_io_md": false, 00:14:41.837 "write_zeroes": true, 00:14:41.837 "zcopy": true, 00:14:41.837 "get_zone_info": false, 00:14:41.837 "zone_management": false, 00:14:41.837 "zone_append": false, 00:14:41.837 "compare": false, 00:14:41.837 "compare_and_write": false, 00:14:41.837 "abort": true, 00:14:41.837 "seek_hole": false, 00:14:41.837 "seek_data": false, 00:14:41.837 "copy": true, 00:14:41.837 "nvme_iov_md": false 00:14:41.837 }, 00:14:41.837 "memory_domains": [ 00:14:41.837 { 00:14:41.837 "dma_device_id": "system", 00:14:41.837 "dma_device_type": 1 00:14:41.837 }, 00:14:41.837 { 00:14:41.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.837 "dma_device_type": 2 00:14:41.837 } 00:14:41.837 ], 00:14:41.837 "driver_specific": {} 00:14:41.837 } 00:14:41.837 ] 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.837 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:42.097 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:42.097 "name": "Existed_Raid", 00:14:42.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.097 "strip_size_kb": 64, 00:14:42.097 "state": "configuring", 00:14:42.097 "raid_level": "raid0", 00:14:42.097 "superblock": false, 00:14:42.097 "num_base_bdevs": 3, 00:14:42.097 "num_base_bdevs_discovered": 1, 00:14:42.097 "num_base_bdevs_operational": 3, 00:14:42.097 "base_bdevs_list": [ 00:14:42.097 { 00:14:42.097 "name": "BaseBdev1", 00:14:42.097 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:42.097 "is_configured": true, 00:14:42.097 "data_offset": 0, 00:14:42.097 "data_size": 65536 00:14:42.097 }, 00:14:42.097 { 00:14:42.097 "name": "BaseBdev2", 00:14:42.097 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:42.097 "is_configured": false, 00:14:42.097 "data_offset": 0, 00:14:42.097 "data_size": 0 00:14:42.097 }, 00:14:42.097 { 00:14:42.097 "name": "BaseBdev3", 00:14:42.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:42.097 "is_configured": false, 00:14:42.097 "data_offset": 0, 00:14:42.097 "data_size": 0 00:14:42.097 } 00:14:42.097 ] 00:14:42.097 }' 00:14:42.097 11:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:42.097 11:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.666 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:42.925 [2024-07-15 11:55:56.350548] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:42.925 [2024-07-15 11:55:56.350585] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd9390 name Existed_Raid, state configuring 00:14:42.925 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:43.184 [2024-07-15 11:55:56.595395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:43.184 [2024-07-15 11:55:56.596821] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:43.184 [2024-07-15 11:55:56.596855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:43.184 [2024-07-15 11:55:56.596865] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:43.184 [2024-07-15 11:55:56.596877] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.184 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.444 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.444 "name": "Existed_Raid", 00:14:43.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.444 "strip_size_kb": 64, 00:14:43.444 "state": "configuring", 00:14:43.444 
"raid_level": "raid0", 00:14:43.444 "superblock": false, 00:14:43.444 "num_base_bdevs": 3, 00:14:43.444 "num_base_bdevs_discovered": 1, 00:14:43.444 "num_base_bdevs_operational": 3, 00:14:43.444 "base_bdevs_list": [ 00:14:43.444 { 00:14:43.444 "name": "BaseBdev1", 00:14:43.444 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:43.444 "is_configured": true, 00:14:43.444 "data_offset": 0, 00:14:43.444 "data_size": 65536 00:14:43.444 }, 00:14:43.444 { 00:14:43.444 "name": "BaseBdev2", 00:14:43.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.444 "is_configured": false, 00:14:43.444 "data_offset": 0, 00:14:43.444 "data_size": 0 00:14:43.444 }, 00:14:43.444 { 00:14:43.444 "name": "BaseBdev3", 00:14:43.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:43.444 "is_configured": false, 00:14:43.444 "data_offset": 0, 00:14:43.444 "data_size": 0 00:14:43.444 } 00:14:43.444 ] 00:14:43.444 }' 00:14:43.444 11:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.444 11:55:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:44.012 11:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:44.272 [2024-07-15 11:55:57.697666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:44.272 BaseBdev2 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:44.272 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:44.531 11:55:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:44.791 [ 00:14:44.791 { 00:14:44.791 "name": "BaseBdev2", 00:14:44.791 "aliases": [ 00:14:44.791 "75ab0c0d-ade0-4e53-bf36-b485d2cda852" 00:14:44.791 ], 00:14:44.791 "product_name": "Malloc disk", 00:14:44.791 "block_size": 512, 00:14:44.791 "num_blocks": 65536, 00:14:44.791 "uuid": "75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:44.791 "assigned_rate_limits": { 00:14:44.791 "rw_ios_per_sec": 0, 00:14:44.791 "rw_mbytes_per_sec": 0, 00:14:44.791 "r_mbytes_per_sec": 0, 00:14:44.791 "w_mbytes_per_sec": 0 00:14:44.791 }, 00:14:44.791 "claimed": true, 00:14:44.791 "claim_type": "exclusive_write", 00:14:44.791 "zoned": false, 00:14:44.791 "supported_io_types": { 00:14:44.791 "read": true, 00:14:44.791 "write": true, 00:14:44.791 "unmap": true, 00:14:44.791 "flush": true, 00:14:44.791 "reset": true, 00:14:44.791 "nvme_admin": false, 00:14:44.791 "nvme_io": false, 00:14:44.791 "nvme_io_md": false, 00:14:44.791 "write_zeroes": true, 00:14:44.791 "zcopy": true, 00:14:44.791 "get_zone_info": false, 00:14:44.791 "zone_management": false, 00:14:44.791 "zone_append": false, 00:14:44.791 "compare": false, 00:14:44.791 "compare_and_write": false, 00:14:44.791 "abort": true, 00:14:44.791 "seek_hole": false, 00:14:44.791 "seek_data": false, 00:14:44.791 "copy": true, 00:14:44.791 "nvme_iov_md": false 00:14:44.791 }, 00:14:44.791 "memory_domains": [ 00:14:44.791 { 00:14:44.791 "dma_device_id": "system", 
00:14:44.791 "dma_device_type": 1 00:14:44.791 }, 00:14:44.791 { 00:14:44.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.791 "dma_device_type": 2 00:14:44.791 } 00:14:44.791 ], 00:14:44.791 "driver_specific": {} 00:14:44.791 } 00:14:44.791 ] 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.791 11:55:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:45.051 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:45.051 "name": "Existed_Raid", 00:14:45.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.051 "strip_size_kb": 64, 00:14:45.051 "state": "configuring", 00:14:45.051 "raid_level": "raid0", 00:14:45.051 "superblock": false, 00:14:45.051 "num_base_bdevs": 3, 00:14:45.051 "num_base_bdevs_discovered": 2, 00:14:45.051 "num_base_bdevs_operational": 3, 00:14:45.051 "base_bdevs_list": [ 00:14:45.051 { 00:14:45.051 "name": "BaseBdev1", 00:14:45.051 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:45.051 "is_configured": true, 00:14:45.051 "data_offset": 0, 00:14:45.051 "data_size": 65536 00:14:45.051 }, 00:14:45.051 { 00:14:45.051 "name": "BaseBdev2", 00:14:45.051 "uuid": "75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:45.051 "is_configured": true, 00:14:45.051 "data_offset": 0, 00:14:45.051 "data_size": 65536 00:14:45.051 }, 00:14:45.051 { 00:14:45.051 "name": "BaseBdev3", 00:14:45.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:45.051 "is_configured": false, 00:14:45.051 "data_offset": 0, 00:14:45.051 "data_size": 0 00:14:45.051 } 00:14:45.051 ] 00:14:45.051 }' 00:14:45.051 11:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:45.051 11:55:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.619 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:45.879 [2024-07-15 11:55:59.314540] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:45.879 [2024-07-15 11:55:59.314589] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcda480 00:14:45.879 [2024-07-15 11:55:59.314598] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:45.879 [2024-07-15 11:55:59.314826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9e05d0 00:14:45.879 [2024-07-15 11:55:59.314947] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcda480 00:14:45.879 [2024-07-15 11:55:59.314957] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcda480 00:14:45.879 [2024-07-15 11:55:59.315123] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:45.879 BaseBdev3 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:45.879 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:46.138 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:46.397 [ 00:14:46.397 { 00:14:46.397 "name": "BaseBdev3", 00:14:46.397 "aliases": [ 00:14:46.397 "600441fd-9acf-4b1d-9c15-f76a03f70f91" 00:14:46.397 ], 00:14:46.397 "product_name": "Malloc disk", 00:14:46.397 "block_size": 512, 00:14:46.397 "num_blocks": 65536, 00:14:46.397 "uuid": 
"600441fd-9acf-4b1d-9c15-f76a03f70f91", 00:14:46.397 "assigned_rate_limits": { 00:14:46.397 "rw_ios_per_sec": 0, 00:14:46.397 "rw_mbytes_per_sec": 0, 00:14:46.397 "r_mbytes_per_sec": 0, 00:14:46.397 "w_mbytes_per_sec": 0 00:14:46.397 }, 00:14:46.397 "claimed": true, 00:14:46.397 "claim_type": "exclusive_write", 00:14:46.397 "zoned": false, 00:14:46.397 "supported_io_types": { 00:14:46.397 "read": true, 00:14:46.397 "write": true, 00:14:46.397 "unmap": true, 00:14:46.397 "flush": true, 00:14:46.397 "reset": true, 00:14:46.397 "nvme_admin": false, 00:14:46.397 "nvme_io": false, 00:14:46.397 "nvme_io_md": false, 00:14:46.397 "write_zeroes": true, 00:14:46.397 "zcopy": true, 00:14:46.397 "get_zone_info": false, 00:14:46.397 "zone_management": false, 00:14:46.397 "zone_append": false, 00:14:46.397 "compare": false, 00:14:46.397 "compare_and_write": false, 00:14:46.397 "abort": true, 00:14:46.397 "seek_hole": false, 00:14:46.397 "seek_data": false, 00:14:46.397 "copy": true, 00:14:46.397 "nvme_iov_md": false 00:14:46.397 }, 00:14:46.397 "memory_domains": [ 00:14:46.397 { 00:14:46.397 "dma_device_id": "system", 00:14:46.397 "dma_device_type": 1 00:14:46.397 }, 00:14:46.397 { 00:14:46.397 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.397 "dma_device_type": 2 00:14:46.397 } 00:14:46.397 ], 00:14:46.397 "driver_specific": {} 00:14:46.397 } 00:14:46.397 ] 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:46.397 11:55:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.397 11:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:46.657 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.657 "name": "Existed_Raid", 00:14:46.657 "uuid": "4271244f-45d4-483e-94ea-49b7025e0fcf", 00:14:46.657 "strip_size_kb": 64, 00:14:46.657 "state": "online", 00:14:46.657 "raid_level": "raid0", 00:14:46.657 "superblock": false, 00:14:46.657 "num_base_bdevs": 3, 00:14:46.657 "num_base_bdevs_discovered": 3, 00:14:46.657 "num_base_bdevs_operational": 3, 00:14:46.657 "base_bdevs_list": [ 00:14:46.657 { 00:14:46.657 "name": "BaseBdev1", 00:14:46.657 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:46.657 "is_configured": true, 00:14:46.657 "data_offset": 0, 00:14:46.657 "data_size": 65536 00:14:46.657 }, 00:14:46.657 { 00:14:46.657 "name": "BaseBdev2", 00:14:46.657 "uuid": 
"75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:46.657 "is_configured": true, 00:14:46.657 "data_offset": 0, 00:14:46.657 "data_size": 65536 00:14:46.657 }, 00:14:46.657 { 00:14:46.657 "name": "BaseBdev3", 00:14:46.657 "uuid": "600441fd-9acf-4b1d-9c15-f76a03f70f91", 00:14:46.657 "is_configured": true, 00:14:46.657 "data_offset": 0, 00:14:46.657 "data_size": 65536 00:14:46.657 } 00:14:46.657 ] 00:14:46.657 }' 00:14:46.657 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.657 11:56:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:47.226 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:47.484 [2024-07-15 11:56:00.907224] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:47.484 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:47.484 "name": "Existed_Raid", 00:14:47.484 "aliases": [ 00:14:47.484 "4271244f-45d4-483e-94ea-49b7025e0fcf" 00:14:47.484 ], 00:14:47.484 "product_name": "Raid Volume", 
00:14:47.484 "block_size": 512, 00:14:47.484 "num_blocks": 196608, 00:14:47.485 "uuid": "4271244f-45d4-483e-94ea-49b7025e0fcf", 00:14:47.485 "assigned_rate_limits": { 00:14:47.485 "rw_ios_per_sec": 0, 00:14:47.485 "rw_mbytes_per_sec": 0, 00:14:47.485 "r_mbytes_per_sec": 0, 00:14:47.485 "w_mbytes_per_sec": 0 00:14:47.485 }, 00:14:47.485 "claimed": false, 00:14:47.485 "zoned": false, 00:14:47.485 "supported_io_types": { 00:14:47.485 "read": true, 00:14:47.485 "write": true, 00:14:47.485 "unmap": true, 00:14:47.485 "flush": true, 00:14:47.485 "reset": true, 00:14:47.485 "nvme_admin": false, 00:14:47.485 "nvme_io": false, 00:14:47.485 "nvme_io_md": false, 00:14:47.485 "write_zeroes": true, 00:14:47.485 "zcopy": false, 00:14:47.485 "get_zone_info": false, 00:14:47.485 "zone_management": false, 00:14:47.485 "zone_append": false, 00:14:47.485 "compare": false, 00:14:47.485 "compare_and_write": false, 00:14:47.485 "abort": false, 00:14:47.485 "seek_hole": false, 00:14:47.485 "seek_data": false, 00:14:47.485 "copy": false, 00:14:47.485 "nvme_iov_md": false 00:14:47.485 }, 00:14:47.485 "memory_domains": [ 00:14:47.485 { 00:14:47.485 "dma_device_id": "system", 00:14:47.485 "dma_device_type": 1 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.485 "dma_device_type": 2 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "dma_device_id": "system", 00:14:47.485 "dma_device_type": 1 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.485 "dma_device_type": 2 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "dma_device_id": "system", 00:14:47.485 "dma_device_type": 1 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.485 "dma_device_type": 2 00:14:47.485 } 00:14:47.485 ], 00:14:47.485 "driver_specific": { 00:14:47.485 "raid": { 00:14:47.485 "uuid": "4271244f-45d4-483e-94ea-49b7025e0fcf", 00:14:47.485 "strip_size_kb": 64, 00:14:47.485 "state": "online", 00:14:47.485 
"raid_level": "raid0", 00:14:47.485 "superblock": false, 00:14:47.485 "num_base_bdevs": 3, 00:14:47.485 "num_base_bdevs_discovered": 3, 00:14:47.485 "num_base_bdevs_operational": 3, 00:14:47.485 "base_bdevs_list": [ 00:14:47.485 { 00:14:47.485 "name": "BaseBdev1", 00:14:47.485 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:47.485 "is_configured": true, 00:14:47.485 "data_offset": 0, 00:14:47.485 "data_size": 65536 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "name": "BaseBdev2", 00:14:47.485 "uuid": "75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:47.485 "is_configured": true, 00:14:47.485 "data_offset": 0, 00:14:47.485 "data_size": 65536 00:14:47.485 }, 00:14:47.485 { 00:14:47.485 "name": "BaseBdev3", 00:14:47.485 "uuid": "600441fd-9acf-4b1d-9c15-f76a03f70f91", 00:14:47.485 "is_configured": true, 00:14:47.485 "data_offset": 0, 00:14:47.485 "data_size": 65536 00:14:47.485 } 00:14:47.485 ] 00:14:47.485 } 00:14:47.485 } 00:14:47.485 }' 00:14:47.485 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:47.485 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:47.485 BaseBdev2 00:14:47.485 BaseBdev3' 00:14:47.485 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:47.485 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:47.485 11:56:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:47.743 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:47.743 "name": "BaseBdev1", 00:14:47.743 "aliases": [ 00:14:47.743 "ed346f6a-9f87-484f-bd3f-beed15449221" 00:14:47.743 ], 00:14:47.743 "product_name": "Malloc disk", 00:14:47.743 
"block_size": 512, 00:14:47.743 "num_blocks": 65536, 00:14:47.743 "uuid": "ed346f6a-9f87-484f-bd3f-beed15449221", 00:14:47.743 "assigned_rate_limits": { 00:14:47.743 "rw_ios_per_sec": 0, 00:14:47.743 "rw_mbytes_per_sec": 0, 00:14:47.743 "r_mbytes_per_sec": 0, 00:14:47.743 "w_mbytes_per_sec": 0 00:14:47.743 }, 00:14:47.743 "claimed": true, 00:14:47.743 "claim_type": "exclusive_write", 00:14:47.743 "zoned": false, 00:14:47.743 "supported_io_types": { 00:14:47.743 "read": true, 00:14:47.743 "write": true, 00:14:47.743 "unmap": true, 00:14:47.743 "flush": true, 00:14:47.743 "reset": true, 00:14:47.743 "nvme_admin": false, 00:14:47.743 "nvme_io": false, 00:14:47.743 "nvme_io_md": false, 00:14:47.743 "write_zeroes": true, 00:14:47.743 "zcopy": true, 00:14:47.743 "get_zone_info": false, 00:14:47.743 "zone_management": false, 00:14:47.743 "zone_append": false, 00:14:47.743 "compare": false, 00:14:47.743 "compare_and_write": false, 00:14:47.743 "abort": true, 00:14:47.743 "seek_hole": false, 00:14:47.743 "seek_data": false, 00:14:47.743 "copy": true, 00:14:47.743 "nvme_iov_md": false 00:14:47.743 }, 00:14:47.743 "memory_domains": [ 00:14:47.744 { 00:14:47.744 "dma_device_id": "system", 00:14:47.744 "dma_device_type": 1 00:14:47.744 }, 00:14:47.744 { 00:14:47.744 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:47.744 "dma_device_type": 2 00:14:47.744 } 00:14:47.744 ], 00:14:47.744 "driver_specific": {} 00:14:47.744 }' 00:14:47.744 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.744 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:47.744 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:47.744 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:48.002 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:48.261 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:48.261 "name": "BaseBdev2", 00:14:48.261 "aliases": [ 00:14:48.261 "75ab0c0d-ade0-4e53-bf36-b485d2cda852" 00:14:48.261 ], 00:14:48.261 "product_name": "Malloc disk", 00:14:48.261 "block_size": 512, 00:14:48.261 "num_blocks": 65536, 00:14:48.261 "uuid": "75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:48.261 "assigned_rate_limits": { 00:14:48.261 "rw_ios_per_sec": 0, 00:14:48.261 "rw_mbytes_per_sec": 0, 00:14:48.261 "r_mbytes_per_sec": 0, 00:14:48.261 "w_mbytes_per_sec": 0 00:14:48.261 }, 00:14:48.261 "claimed": true, 00:14:48.261 "claim_type": "exclusive_write", 00:14:48.261 "zoned": false, 00:14:48.261 "supported_io_types": { 00:14:48.261 "read": true, 00:14:48.261 "write": true, 00:14:48.261 "unmap": true, 00:14:48.261 "flush": true, 00:14:48.261 "reset": true, 00:14:48.261 "nvme_admin": 
false, 00:14:48.261 "nvme_io": false, 00:14:48.261 "nvme_io_md": false, 00:14:48.261 "write_zeroes": true, 00:14:48.261 "zcopy": true, 00:14:48.261 "get_zone_info": false, 00:14:48.261 "zone_management": false, 00:14:48.261 "zone_append": false, 00:14:48.261 "compare": false, 00:14:48.261 "compare_and_write": false, 00:14:48.261 "abort": true, 00:14:48.261 "seek_hole": false, 00:14:48.261 "seek_data": false, 00:14:48.261 "copy": true, 00:14:48.261 "nvme_iov_md": false 00:14:48.261 }, 00:14:48.261 "memory_domains": [ 00:14:48.261 { 00:14:48.261 "dma_device_id": "system", 00:14:48.261 "dma_device_type": 1 00:14:48.261 }, 00:14:48.261 { 00:14:48.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:48.261 "dma_device_type": 2 00:14:48.261 } 00:14:48.261 ], 00:14:48.261 "driver_specific": {} 00:14:48.261 }' 00:14:48.261 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:48.520 11:56:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.520 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:48.520 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:48.520 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.779 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:48.779 11:56:02 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:48.779 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:48.779 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:48.779 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:49.050 "name": "BaseBdev3", 00:14:49.050 "aliases": [ 00:14:49.050 "600441fd-9acf-4b1d-9c15-f76a03f70f91" 00:14:49.050 ], 00:14:49.050 "product_name": "Malloc disk", 00:14:49.050 "block_size": 512, 00:14:49.050 "num_blocks": 65536, 00:14:49.050 "uuid": "600441fd-9acf-4b1d-9c15-f76a03f70f91", 00:14:49.050 "assigned_rate_limits": { 00:14:49.050 "rw_ios_per_sec": 0, 00:14:49.050 "rw_mbytes_per_sec": 0, 00:14:49.050 "r_mbytes_per_sec": 0, 00:14:49.050 "w_mbytes_per_sec": 0 00:14:49.050 }, 00:14:49.050 "claimed": true, 00:14:49.050 "claim_type": "exclusive_write", 00:14:49.050 "zoned": false, 00:14:49.050 "supported_io_types": { 00:14:49.050 "read": true, 00:14:49.050 "write": true, 00:14:49.050 "unmap": true, 00:14:49.050 "flush": true, 00:14:49.050 "reset": true, 00:14:49.050 "nvme_admin": false, 00:14:49.050 "nvme_io": false, 00:14:49.050 "nvme_io_md": false, 00:14:49.050 "write_zeroes": true, 00:14:49.050 "zcopy": true, 00:14:49.050 "get_zone_info": false, 00:14:49.050 "zone_management": false, 00:14:49.050 "zone_append": false, 00:14:49.050 "compare": false, 00:14:49.050 "compare_and_write": false, 00:14:49.050 "abort": true, 00:14:49.050 "seek_hole": false, 00:14:49.050 "seek_data": false, 00:14:49.050 "copy": true, 00:14:49.050 "nvme_iov_md": false 00:14:49.050 }, 00:14:49.050 "memory_domains": [ 00:14:49.050 { 00:14:49.050 "dma_device_id": "system", 00:14:49.050 "dma_device_type": 1 00:14:49.050 
}, 00:14:49.050 { 00:14:49.050 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:49.050 "dma_device_type": 2 00:14:49.050 } 00:14:49.050 ], 00:14:49.050 "driver_specific": {} 00:14:49.050 }' 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:49.050 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:49.308 11:56:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:49.567 [2024-07-15 11:56:03.008528] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:49.567 [2024-07-15 11:56:03.008559] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:49.567 [2024-07-15 11:56:03.008603] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:49.567 
11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.567 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:49.825 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.825 "name": "Existed_Raid", 00:14:49.825 "uuid": "4271244f-45d4-483e-94ea-49b7025e0fcf", 00:14:49.826 "strip_size_kb": 64, 00:14:49.826 "state": "offline", 00:14:49.826 "raid_level": "raid0", 00:14:49.826 "superblock": false, 00:14:49.826 "num_base_bdevs": 3, 00:14:49.826 "num_base_bdevs_discovered": 2, 00:14:49.826 "num_base_bdevs_operational": 2, 00:14:49.826 "base_bdevs_list": [ 00:14:49.826 { 00:14:49.826 "name": null, 00:14:49.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.826 "is_configured": false, 00:14:49.826 "data_offset": 0, 00:14:49.826 "data_size": 65536 00:14:49.826 }, 00:14:49.826 { 00:14:49.826 "name": "BaseBdev2", 00:14:49.826 "uuid": "75ab0c0d-ade0-4e53-bf36-b485d2cda852", 00:14:49.826 "is_configured": true, 00:14:49.826 "data_offset": 0, 00:14:49.826 "data_size": 65536 00:14:49.826 }, 00:14:49.826 { 00:14:49.826 "name": "BaseBdev3", 00:14:49.826 "uuid": "600441fd-9acf-4b1d-9c15-f76a03f70f91", 00:14:49.826 "is_configured": true, 00:14:49.826 "data_offset": 0, 00:14:49.826 "data_size": 65536 00:14:49.826 } 00:14:49.826 ] 00:14:49.826 }' 00:14:49.826 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.826 11:56:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.393 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:50.393 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:50.393 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:50.393 11:56:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.652 11:56:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:50.652 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:50.652 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:50.910 [2024-07-15 11:56:04.345212] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:50.910 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:50.910 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:50.910 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:50.910 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:51.168 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:51.168 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:51.168 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:51.426 [2024-07-15 11:56:04.801002] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:51.426 [2024-07-15 11:56:04.801051] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcda480 name Existed_Raid, state offline 00:14:51.426 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:51.426 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:51.426 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.426 11:56:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:51.684 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:51.943 BaseBdev2 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:51.943 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.201 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:52.201 [ 00:14:52.201 { 00:14:52.201 "name": "BaseBdev2", 00:14:52.201 "aliases": [ 00:14:52.201 "4f884e1f-7fad-4543-b508-b302225bd7be" 00:14:52.201 ], 00:14:52.201 "product_name": "Malloc disk", 00:14:52.201 "block_size": 512, 00:14:52.201 "num_blocks": 65536, 00:14:52.201 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:52.201 "assigned_rate_limits": { 00:14:52.201 "rw_ios_per_sec": 0, 00:14:52.201 "rw_mbytes_per_sec": 0, 00:14:52.201 "r_mbytes_per_sec": 0, 00:14:52.201 "w_mbytes_per_sec": 0 00:14:52.201 }, 00:14:52.201 "claimed": false, 00:14:52.201 "zoned": false, 00:14:52.201 "supported_io_types": { 00:14:52.201 "read": true, 00:14:52.201 "write": true, 00:14:52.201 "unmap": true, 00:14:52.201 "flush": true, 00:14:52.201 "reset": true, 00:14:52.201 "nvme_admin": false, 00:14:52.201 "nvme_io": false, 00:14:52.201 "nvme_io_md": false, 00:14:52.201 "write_zeroes": true, 00:14:52.201 "zcopy": true, 00:14:52.201 "get_zone_info": false, 00:14:52.201 "zone_management": false, 00:14:52.201 "zone_append": false, 00:14:52.201 "compare": false, 00:14:52.201 "compare_and_write": false, 00:14:52.201 "abort": true, 00:14:52.201 "seek_hole": false, 00:14:52.201 "seek_data": false, 00:14:52.201 "copy": true, 00:14:52.201 "nvme_iov_md": false 00:14:52.201 }, 00:14:52.201 "memory_domains": [ 00:14:52.201 { 00:14:52.201 "dma_device_id": "system", 00:14:52.201 "dma_device_type": 1 00:14:52.201 }, 00:14:52.201 { 00:14:52.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.201 "dma_device_type": 2 00:14:52.201 } 00:14:52.201 ], 00:14:52.201 "driver_specific": {} 00:14:52.201 } 00:14:52.202 ] 00:14:52.460 11:56:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:52.460 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:52.460 11:56:05 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:52.460 11:56:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:52.460 BaseBdev3 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.720 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:52.979 [ 00:14:52.979 { 00:14:52.979 "name": "BaseBdev3", 00:14:52.979 "aliases": [ 00:14:52.979 "0fd81c94-3565-465a-9cf4-0d6bca402c77" 00:14:52.979 ], 00:14:52.979 "product_name": "Malloc disk", 00:14:52.979 "block_size": 512, 00:14:52.979 "num_blocks": 65536, 00:14:52.979 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:52.979 "assigned_rate_limits": { 00:14:52.979 "rw_ios_per_sec": 0, 00:14:52.979 "rw_mbytes_per_sec": 0, 00:14:52.979 "r_mbytes_per_sec": 0, 00:14:52.979 "w_mbytes_per_sec": 0 00:14:52.979 }, 00:14:52.979 "claimed": false, 00:14:52.979 "zoned": false, 00:14:52.979 
"supported_io_types": { 00:14:52.979 "read": true, 00:14:52.979 "write": true, 00:14:52.979 "unmap": true, 00:14:52.979 "flush": true, 00:14:52.979 "reset": true, 00:14:52.979 "nvme_admin": false, 00:14:52.979 "nvme_io": false, 00:14:52.979 "nvme_io_md": false, 00:14:52.979 "write_zeroes": true, 00:14:52.979 "zcopy": true, 00:14:52.979 "get_zone_info": false, 00:14:52.979 "zone_management": false, 00:14:52.979 "zone_append": false, 00:14:52.979 "compare": false, 00:14:52.979 "compare_and_write": false, 00:14:52.979 "abort": true, 00:14:52.979 "seek_hole": false, 00:14:52.979 "seek_data": false, 00:14:52.979 "copy": true, 00:14:52.979 "nvme_iov_md": false 00:14:52.979 }, 00:14:52.979 "memory_domains": [ 00:14:52.979 { 00:14:52.979 "dma_device_id": "system", 00:14:52.979 "dma_device_type": 1 00:14:52.979 }, 00:14:52.979 { 00:14:52.979 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.979 "dma_device_type": 2 00:14:52.979 } 00:14:52.979 ], 00:14:52.979 "driver_specific": {} 00:14:52.979 } 00:14:52.979 ] 00:14:52.979 11:56:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:52.979 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:52.979 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:52.979 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:53.238 [2024-07-15 11:56:06.755043] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:53.238 [2024-07-15 11:56:06.755091] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:53.238 [2024-07-15 11:56:06.755114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:53.238 
[2024-07-15 11:56:06.756671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.238 11:56:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.497 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.497 "name": "Existed_Raid", 00:14:53.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.497 "strip_size_kb": 64, 00:14:53.497 "state": "configuring", 00:14:53.497 "raid_level": "raid0", 00:14:53.497 "superblock": false, 00:14:53.497 "num_base_bdevs": 3, 00:14:53.497 
"num_base_bdevs_discovered": 2, 00:14:53.497 "num_base_bdevs_operational": 3, 00:14:53.497 "base_bdevs_list": [ 00:14:53.497 { 00:14:53.497 "name": "BaseBdev1", 00:14:53.497 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.497 "is_configured": false, 00:14:53.497 "data_offset": 0, 00:14:53.497 "data_size": 0 00:14:53.497 }, 00:14:53.497 { 00:14:53.497 "name": "BaseBdev2", 00:14:53.497 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:53.497 "is_configured": true, 00:14:53.497 "data_offset": 0, 00:14:53.497 "data_size": 65536 00:14:53.497 }, 00:14:53.497 { 00:14:53.497 "name": "BaseBdev3", 00:14:53.497 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:53.497 "is_configured": true, 00:14:53.497 "data_offset": 0, 00:14:53.497 "data_size": 65536 00:14:53.497 } 00:14:53.497 ] 00:14:53.497 }' 00:14:53.497 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.497 11:56:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.064 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:54.323 [2024-07-15 11:56:07.849927] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.323 11:56:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.582 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.582 "name": "Existed_Raid", 00:14:54.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.582 "strip_size_kb": 64, 00:14:54.582 "state": "configuring", 00:14:54.582 "raid_level": "raid0", 00:14:54.582 "superblock": false, 00:14:54.582 "num_base_bdevs": 3, 00:14:54.582 "num_base_bdevs_discovered": 1, 00:14:54.582 "num_base_bdevs_operational": 3, 00:14:54.582 "base_bdevs_list": [ 00:14:54.582 { 00:14:54.582 "name": "BaseBdev1", 00:14:54.582 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.582 "is_configured": false, 00:14:54.582 "data_offset": 0, 00:14:54.582 "data_size": 0 00:14:54.582 }, 00:14:54.582 { 00:14:54.582 "name": null, 00:14:54.582 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:54.582 "is_configured": false, 00:14:54.582 "data_offset": 0, 00:14:54.582 "data_size": 65536 00:14:54.582 }, 00:14:54.582 { 00:14:54.582 "name": "BaseBdev3", 00:14:54.582 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:54.582 "is_configured": true, 00:14:54.582 "data_offset": 0, 00:14:54.582 "data_size": 65536 00:14:54.582 } 
00:14:54.582 ] 00:14:54.582 }' 00:14:54.582 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.582 11:56:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.150 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.150 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:55.408 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:55.408 11:56:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:55.668 [2024-07-15 11:56:09.217347] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:55.668 BaseBdev1 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:14:55.668 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:55.927 11:56:09 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:56.186 [ 00:14:56.186 { 00:14:56.186 "name": "BaseBdev1", 00:14:56.186 "aliases": [ 00:14:56.186 "dc144da0-ea08-4787-a3e7-5482f726daf5" 00:14:56.186 ], 00:14:56.186 "product_name": "Malloc disk", 00:14:56.186 "block_size": 512, 00:14:56.186 "num_blocks": 65536, 00:14:56.186 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:14:56.186 "assigned_rate_limits": { 00:14:56.186 "rw_ios_per_sec": 0, 00:14:56.186 "rw_mbytes_per_sec": 0, 00:14:56.186 "r_mbytes_per_sec": 0, 00:14:56.186 "w_mbytes_per_sec": 0 00:14:56.186 }, 00:14:56.186 "claimed": true, 00:14:56.186 "claim_type": "exclusive_write", 00:14:56.186 "zoned": false, 00:14:56.186 "supported_io_types": { 00:14:56.186 "read": true, 00:14:56.186 "write": true, 00:14:56.186 "unmap": true, 00:14:56.186 "flush": true, 00:14:56.186 "reset": true, 00:14:56.186 "nvme_admin": false, 00:14:56.186 "nvme_io": false, 00:14:56.186 "nvme_io_md": false, 00:14:56.186 "write_zeroes": true, 00:14:56.186 "zcopy": true, 00:14:56.186 "get_zone_info": false, 00:14:56.186 "zone_management": false, 00:14:56.187 "zone_append": false, 00:14:56.187 "compare": false, 00:14:56.187 "compare_and_write": false, 00:14:56.187 "abort": true, 00:14:56.187 "seek_hole": false, 00:14:56.187 "seek_data": false, 00:14:56.187 "copy": true, 00:14:56.187 "nvme_iov_md": false 00:14:56.187 }, 00:14:56.187 "memory_domains": [ 00:14:56.187 { 00:14:56.187 "dma_device_id": "system", 00:14:56.187 "dma_device_type": 1 00:14:56.187 }, 00:14:56.187 { 00:14:56.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.187 "dma_device_type": 2 00:14:56.187 } 00:14:56.187 ], 00:14:56.187 "driver_specific": {} 00:14:56.187 } 00:14:56.187 ] 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.187 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.445 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.445 "name": "Existed_Raid", 00:14:56.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:56.445 "strip_size_kb": 64, 00:14:56.445 "state": "configuring", 00:14:56.445 "raid_level": "raid0", 00:14:56.445 "superblock": false, 00:14:56.445 "num_base_bdevs": 3, 00:14:56.445 "num_base_bdevs_discovered": 2, 00:14:56.445 "num_base_bdevs_operational": 3, 00:14:56.445 "base_bdevs_list": [ 00:14:56.445 { 00:14:56.445 "name": "BaseBdev1", 00:14:56.445 
"uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:14:56.445 "is_configured": true, 00:14:56.445 "data_offset": 0, 00:14:56.445 "data_size": 65536 00:14:56.445 }, 00:14:56.445 { 00:14:56.445 "name": null, 00:14:56.445 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:56.445 "is_configured": false, 00:14:56.445 "data_offset": 0, 00:14:56.445 "data_size": 65536 00:14:56.445 }, 00:14:56.445 { 00:14:56.445 "name": "BaseBdev3", 00:14:56.445 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:56.445 "is_configured": true, 00:14:56.445 "data_offset": 0, 00:14:56.445 "data_size": 65536 00:14:56.445 } 00:14:56.445 ] 00:14:56.445 }' 00:14:56.445 11:56:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.445 11:56:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.013 11:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.013 11:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:57.272 11:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:57.272 11:56:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:57.530 [2024-07-15 11:56:11.034179] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.530 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.788 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.788 "name": "Existed_Raid", 00:14:57.788 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.788 "strip_size_kb": 64, 00:14:57.788 "state": "configuring", 00:14:57.788 "raid_level": "raid0", 00:14:57.788 "superblock": false, 00:14:57.788 "num_base_bdevs": 3, 00:14:57.788 "num_base_bdevs_discovered": 1, 00:14:57.788 "num_base_bdevs_operational": 3, 00:14:57.788 "base_bdevs_list": [ 00:14:57.788 { 00:14:57.788 "name": "BaseBdev1", 00:14:57.788 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:14:57.788 "is_configured": true, 00:14:57.788 "data_offset": 0, 00:14:57.788 "data_size": 65536 00:14:57.788 }, 00:14:57.788 { 00:14:57.788 "name": null, 00:14:57.788 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:57.788 "is_configured": false, 00:14:57.788 
"data_offset": 0, 00:14:57.788 "data_size": 65536 00:14:57.788 }, 00:14:57.788 { 00:14:57.788 "name": null, 00:14:57.788 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:57.788 "is_configured": false, 00:14:57.788 "data_offset": 0, 00:14:57.788 "data_size": 65536 00:14:57.788 } 00:14:57.788 ] 00:14:57.788 }' 00:14:57.788 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.788 11:56:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:58.355 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.356 11:56:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:58.614 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:58.614 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:58.872 [2024-07-15 11:56:12.365755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.872 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.131 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.131 "name": "Existed_Raid", 00:14:59.131 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.131 "strip_size_kb": 64, 00:14:59.131 "state": "configuring", 00:14:59.131 "raid_level": "raid0", 00:14:59.131 "superblock": false, 00:14:59.131 "num_base_bdevs": 3, 00:14:59.131 "num_base_bdevs_discovered": 2, 00:14:59.131 "num_base_bdevs_operational": 3, 00:14:59.131 "base_bdevs_list": [ 00:14:59.131 { 00:14:59.131 "name": "BaseBdev1", 00:14:59.131 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:14:59.131 "is_configured": true, 00:14:59.131 "data_offset": 0, 00:14:59.131 "data_size": 65536 00:14:59.131 }, 00:14:59.131 { 00:14:59.131 "name": null, 00:14:59.131 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:14:59.131 "is_configured": false, 00:14:59.131 "data_offset": 0, 00:14:59.131 "data_size": 65536 00:14:59.131 }, 00:14:59.131 { 00:14:59.131 "name": "BaseBdev3", 00:14:59.131 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:14:59.131 "is_configured": true, 00:14:59.131 "data_offset": 0, 00:14:59.131 "data_size": 65536 00:14:59.131 } 00:14:59.131 ] 
00:14:59.131 }' 00:14:59.131 11:56:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.131 11:56:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.701 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.701 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:00.047 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:00.047 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:00.306 [2024-07-15 11:56:13.693257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.306 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.565 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.565 "name": "Existed_Raid", 00:15:00.565 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.565 "strip_size_kb": 64, 00:15:00.565 "state": "configuring", 00:15:00.565 "raid_level": "raid0", 00:15:00.565 "superblock": false, 00:15:00.565 "num_base_bdevs": 3, 00:15:00.565 "num_base_bdevs_discovered": 1, 00:15:00.565 "num_base_bdevs_operational": 3, 00:15:00.565 "base_bdevs_list": [ 00:15:00.565 { 00:15:00.565 "name": null, 00:15:00.565 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:00.565 "is_configured": false, 00:15:00.565 "data_offset": 0, 00:15:00.565 "data_size": 65536 00:15:00.565 }, 00:15:00.565 { 00:15:00.565 "name": null, 00:15:00.565 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:15:00.565 "is_configured": false, 00:15:00.565 "data_offset": 0, 00:15:00.565 "data_size": 65536 00:15:00.565 }, 00:15:00.565 { 00:15:00.565 "name": "BaseBdev3", 00:15:00.565 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:15:00.565 "is_configured": true, 00:15:00.565 "data_offset": 0, 00:15:00.565 "data_size": 65536 00:15:00.565 } 00:15:00.565 ] 00:15:00.565 }' 00:15:00.565 11:56:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.565 11:56:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.133 11:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.133 11:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:01.393 11:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:01.393 11:56:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:01.651 [2024-07-15 11:56:15.027598] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:01.651 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.652 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:01.652 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:01.652 "name": "Existed_Raid", 00:15:01.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:01.652 "strip_size_kb": 64, 00:15:01.652 "state": "configuring", 00:15:01.652 "raid_level": "raid0", 00:15:01.652 "superblock": false, 00:15:01.652 "num_base_bdevs": 3, 00:15:01.652 "num_base_bdevs_discovered": 2, 00:15:01.652 "num_base_bdevs_operational": 3, 00:15:01.652 "base_bdevs_list": [ 00:15:01.652 { 00:15:01.652 "name": null, 00:15:01.652 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:01.652 "is_configured": false, 00:15:01.652 "data_offset": 0, 00:15:01.652 "data_size": 65536 00:15:01.652 }, 00:15:01.652 { 00:15:01.652 "name": "BaseBdev2", 00:15:01.652 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:15:01.652 "is_configured": true, 00:15:01.652 "data_offset": 0, 00:15:01.652 "data_size": 65536 00:15:01.652 }, 00:15:01.652 { 00:15:01.652 "name": "BaseBdev3", 00:15:01.652 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:15:01.652 "is_configured": true, 00:15:01.652 "data_offset": 0, 00:15:01.652 "data_size": 65536 00:15:01.652 } 00:15:01.652 ] 00:15:01.652 }' 00:15:01.652 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:01.652 11:56:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.589 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.589 11:56:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:02.589 
11:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:02.589 11:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.589 11:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:02.848 11:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u dc144da0-ea08-4787-a3e7-5482f726daf5 00:15:03.108 [2024-07-15 11:56:16.575093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:03.108 [2024-07-15 11:56:16.575133] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcdac10 00:15:03.108 [2024-07-15 11:56:16.575143] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:03.108 [2024-07-15 11:56:16.575335] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbf4420 00:15:03.108 [2024-07-15 11:56:16.575447] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcdac10 00:15:03.108 [2024-07-15 11:56:16.575457] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xcdac10 00:15:03.108 [2024-07-15 11:56:16.575621] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:03.108 NewBaseBdev 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:03.108 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:03.367 11:56:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:03.626 [ 00:15:03.626 { 00:15:03.626 "name": "NewBaseBdev", 00:15:03.626 "aliases": [ 00:15:03.626 "dc144da0-ea08-4787-a3e7-5482f726daf5" 00:15:03.626 ], 00:15:03.626 "product_name": "Malloc disk", 00:15:03.626 "block_size": 512, 00:15:03.626 "num_blocks": 65536, 00:15:03.626 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:03.626 "assigned_rate_limits": { 00:15:03.626 "rw_ios_per_sec": 0, 00:15:03.626 "rw_mbytes_per_sec": 0, 00:15:03.626 "r_mbytes_per_sec": 0, 00:15:03.626 "w_mbytes_per_sec": 0 00:15:03.626 }, 00:15:03.626 "claimed": true, 00:15:03.626 "claim_type": "exclusive_write", 00:15:03.626 "zoned": false, 00:15:03.626 "supported_io_types": { 00:15:03.626 "read": true, 00:15:03.626 "write": true, 00:15:03.626 "unmap": true, 00:15:03.626 "flush": true, 00:15:03.626 "reset": true, 00:15:03.626 "nvme_admin": false, 00:15:03.626 "nvme_io": false, 00:15:03.626 "nvme_io_md": false, 00:15:03.626 "write_zeroes": true, 00:15:03.626 "zcopy": true, 00:15:03.626 "get_zone_info": false, 00:15:03.626 "zone_management": false, 00:15:03.626 "zone_append": false, 00:15:03.626 "compare": false, 00:15:03.626 "compare_and_write": false, 00:15:03.626 "abort": true, 00:15:03.626 "seek_hole": false, 00:15:03.626 "seek_data": false, 00:15:03.626 "copy": true, 00:15:03.626 "nvme_iov_md": 
false 00:15:03.626 }, 00:15:03.626 "memory_domains": [ 00:15:03.626 { 00:15:03.626 "dma_device_id": "system", 00:15:03.626 "dma_device_type": 1 00:15:03.626 }, 00:15:03.626 { 00:15:03.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.627 "dma_device_type": 2 00:15:03.627 } 00:15:03.627 ], 00:15:03.627 "driver_specific": {} 00:15:03.627 } 00:15:03.627 ] 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.627 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.902 11:56:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.902 "name": "Existed_Raid", 00:15:03.902 "uuid": "690e9254-89be-42f8-b2ff-992e1af4fe00", 00:15:03.902 "strip_size_kb": 64, 00:15:03.902 "state": "online", 00:15:03.902 "raid_level": "raid0", 00:15:03.902 "superblock": false, 00:15:03.902 "num_base_bdevs": 3, 00:15:03.902 "num_base_bdevs_discovered": 3, 00:15:03.902 "num_base_bdevs_operational": 3, 00:15:03.902 "base_bdevs_list": [ 00:15:03.902 { 00:15:03.902 "name": "NewBaseBdev", 00:15:03.902 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:03.902 "is_configured": true, 00:15:03.902 "data_offset": 0, 00:15:03.902 "data_size": 65536 00:15:03.902 }, 00:15:03.902 { 00:15:03.902 "name": "BaseBdev2", 00:15:03.902 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:15:03.902 "is_configured": true, 00:15:03.902 "data_offset": 0, 00:15:03.902 "data_size": 65536 00:15:03.902 }, 00:15:03.902 { 00:15:03.902 "name": "BaseBdev3", 00:15:03.902 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:15:03.902 "is_configured": true, 00:15:03.902 "data_offset": 0, 00:15:03.902 "data_size": 65536 00:15:03.902 } 00:15:03.902 ] 00:15:03.902 }' 00:15:03.902 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.902 11:56:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:04.471 11:56:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:04.471 11:56:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:04.731 [2024-07-15 11:56:18.111500] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:04.731 "name": "Existed_Raid", 00:15:04.731 "aliases": [ 00:15:04.731 "690e9254-89be-42f8-b2ff-992e1af4fe00" 00:15:04.731 ], 00:15:04.731 "product_name": "Raid Volume", 00:15:04.731 "block_size": 512, 00:15:04.731 "num_blocks": 196608, 00:15:04.731 "uuid": "690e9254-89be-42f8-b2ff-992e1af4fe00", 00:15:04.731 "assigned_rate_limits": { 00:15:04.731 "rw_ios_per_sec": 0, 00:15:04.731 "rw_mbytes_per_sec": 0, 00:15:04.731 "r_mbytes_per_sec": 0, 00:15:04.731 "w_mbytes_per_sec": 0 00:15:04.731 }, 00:15:04.731 "claimed": false, 00:15:04.731 "zoned": false, 00:15:04.731 "supported_io_types": { 00:15:04.731 "read": true, 00:15:04.731 "write": true, 00:15:04.731 "unmap": true, 00:15:04.731 "flush": true, 00:15:04.731 "reset": true, 00:15:04.731 "nvme_admin": false, 00:15:04.731 "nvme_io": false, 00:15:04.731 "nvme_io_md": false, 00:15:04.731 "write_zeroes": true, 00:15:04.731 "zcopy": false, 00:15:04.731 "get_zone_info": false, 00:15:04.731 "zone_management": false, 00:15:04.731 "zone_append": false, 00:15:04.731 "compare": false, 00:15:04.731 "compare_and_write": false, 00:15:04.731 "abort": false, 00:15:04.731 "seek_hole": false, 00:15:04.731 "seek_data": false, 00:15:04.731 "copy": false, 00:15:04.731 "nvme_iov_md": false 00:15:04.731 }, 00:15:04.731 "memory_domains": [ 00:15:04.731 { 00:15:04.731 "dma_device_id": "system", 00:15:04.731 "dma_device_type": 1 00:15:04.731 }, 
00:15:04.731 { 00:15:04.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.731 "dma_device_type": 2 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "dma_device_id": "system", 00:15:04.731 "dma_device_type": 1 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.731 "dma_device_type": 2 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "dma_device_id": "system", 00:15:04.731 "dma_device_type": 1 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.731 "dma_device_type": 2 00:15:04.731 } 00:15:04.731 ], 00:15:04.731 "driver_specific": { 00:15:04.731 "raid": { 00:15:04.731 "uuid": "690e9254-89be-42f8-b2ff-992e1af4fe00", 00:15:04.731 "strip_size_kb": 64, 00:15:04.731 "state": "online", 00:15:04.731 "raid_level": "raid0", 00:15:04.731 "superblock": false, 00:15:04.731 "num_base_bdevs": 3, 00:15:04.731 "num_base_bdevs_discovered": 3, 00:15:04.731 "num_base_bdevs_operational": 3, 00:15:04.731 "base_bdevs_list": [ 00:15:04.731 { 00:15:04.731 "name": "NewBaseBdev", 00:15:04.731 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:04.731 "is_configured": true, 00:15:04.731 "data_offset": 0, 00:15:04.731 "data_size": 65536 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "name": "BaseBdev2", 00:15:04.731 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:15:04.731 "is_configured": true, 00:15:04.731 "data_offset": 0, 00:15:04.731 "data_size": 65536 00:15:04.731 }, 00:15:04.731 { 00:15:04.731 "name": "BaseBdev3", 00:15:04.731 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:15:04.731 "is_configured": true, 00:15:04.731 "data_offset": 0, 00:15:04.731 "data_size": 65536 00:15:04.731 } 00:15:04.731 ] 00:15:04.731 } 00:15:04.731 } 00:15:04.731 }' 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:04.731 BaseBdev2 00:15:04.731 BaseBdev3' 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:04.731 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:04.991 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:04.991 "name": "NewBaseBdev", 00:15:04.991 "aliases": [ 00:15:04.991 "dc144da0-ea08-4787-a3e7-5482f726daf5" 00:15:04.991 ], 00:15:04.991 "product_name": "Malloc disk", 00:15:04.991 "block_size": 512, 00:15:04.991 "num_blocks": 65536, 00:15:04.991 "uuid": "dc144da0-ea08-4787-a3e7-5482f726daf5", 00:15:04.991 "assigned_rate_limits": { 00:15:04.991 "rw_ios_per_sec": 0, 00:15:04.991 "rw_mbytes_per_sec": 0, 00:15:04.991 "r_mbytes_per_sec": 0, 00:15:04.991 "w_mbytes_per_sec": 0 00:15:04.991 }, 00:15:04.991 "claimed": true, 00:15:04.991 "claim_type": "exclusive_write", 00:15:04.991 "zoned": false, 00:15:04.991 "supported_io_types": { 00:15:04.991 "read": true, 00:15:04.991 "write": true, 00:15:04.991 "unmap": true, 00:15:04.991 "flush": true, 00:15:04.991 "reset": true, 00:15:04.991 "nvme_admin": false, 00:15:04.991 "nvme_io": false, 00:15:04.991 "nvme_io_md": false, 00:15:04.991 "write_zeroes": true, 00:15:04.991 "zcopy": true, 00:15:04.991 "get_zone_info": false, 00:15:04.991 "zone_management": false, 00:15:04.991 "zone_append": false, 00:15:04.991 "compare": false, 00:15:04.991 "compare_and_write": false, 00:15:04.991 "abort": true, 00:15:04.991 "seek_hole": false, 00:15:04.992 "seek_data": false, 00:15:04.992 "copy": true, 00:15:04.992 "nvme_iov_md": false 00:15:04.992 }, 00:15:04.992 "memory_domains": [ 00:15:04.992 { 00:15:04.992 "dma_device_id": "system", 00:15:04.992 
"dma_device_type": 1 00:15:04.992 }, 00:15:04.992 { 00:15:04.992 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:04.992 "dma_device_type": 2 00:15:04.992 } 00:15:04.992 ], 00:15:04.992 "driver_specific": {} 00:15:04.992 }' 00:15:04.992 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.992 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:04.992 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:04.992 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:04.992 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:05.251 11:56:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:05.510 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:05.510 "name": 
"BaseBdev2", 00:15:05.510 "aliases": [ 00:15:05.510 "4f884e1f-7fad-4543-b508-b302225bd7be" 00:15:05.510 ], 00:15:05.510 "product_name": "Malloc disk", 00:15:05.510 "block_size": 512, 00:15:05.510 "num_blocks": 65536, 00:15:05.510 "uuid": "4f884e1f-7fad-4543-b508-b302225bd7be", 00:15:05.510 "assigned_rate_limits": { 00:15:05.510 "rw_ios_per_sec": 0, 00:15:05.510 "rw_mbytes_per_sec": 0, 00:15:05.510 "r_mbytes_per_sec": 0, 00:15:05.510 "w_mbytes_per_sec": 0 00:15:05.510 }, 00:15:05.510 "claimed": true, 00:15:05.510 "claim_type": "exclusive_write", 00:15:05.510 "zoned": false, 00:15:05.510 "supported_io_types": { 00:15:05.510 "read": true, 00:15:05.510 "write": true, 00:15:05.510 "unmap": true, 00:15:05.510 "flush": true, 00:15:05.510 "reset": true, 00:15:05.510 "nvme_admin": false, 00:15:05.510 "nvme_io": false, 00:15:05.510 "nvme_io_md": false, 00:15:05.510 "write_zeroes": true, 00:15:05.510 "zcopy": true, 00:15:05.510 "get_zone_info": false, 00:15:05.510 "zone_management": false, 00:15:05.510 "zone_append": false, 00:15:05.510 "compare": false, 00:15:05.510 "compare_and_write": false, 00:15:05.510 "abort": true, 00:15:05.510 "seek_hole": false, 00:15:05.510 "seek_data": false, 00:15:05.510 "copy": true, 00:15:05.510 "nvme_iov_md": false 00:15:05.510 }, 00:15:05.510 "memory_domains": [ 00:15:05.510 { 00:15:05.510 "dma_device_id": "system", 00:15:05.510 "dma_device_type": 1 00:15:05.510 }, 00:15:05.510 { 00:15:05.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.510 "dma_device_type": 2 00:15:05.510 } 00:15:05.510 ], 00:15:05.510 "driver_specific": {} 00:15:05.510 }' 00:15:05.510 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.510 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:05.769 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.029 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.029 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:06.029 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:06.029 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.029 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.029 "name": "BaseBdev3", 00:15:06.029 "aliases": [ 00:15:06.029 "0fd81c94-3565-465a-9cf4-0d6bca402c77" 00:15:06.029 ], 00:15:06.029 "product_name": "Malloc disk", 00:15:06.029 "block_size": 512, 00:15:06.029 "num_blocks": 65536, 00:15:06.029 "uuid": "0fd81c94-3565-465a-9cf4-0d6bca402c77", 00:15:06.029 "assigned_rate_limits": { 00:15:06.029 "rw_ios_per_sec": 0, 00:15:06.029 "rw_mbytes_per_sec": 0, 00:15:06.029 "r_mbytes_per_sec": 0, 00:15:06.029 "w_mbytes_per_sec": 0 00:15:06.029 }, 00:15:06.029 "claimed": true, 00:15:06.029 "claim_type": "exclusive_write", 00:15:06.029 "zoned": false, 00:15:06.029 "supported_io_types": { 
00:15:06.029 "read": true, 00:15:06.029 "write": true, 00:15:06.029 "unmap": true, 00:15:06.029 "flush": true, 00:15:06.029 "reset": true, 00:15:06.029 "nvme_admin": false, 00:15:06.029 "nvme_io": false, 00:15:06.029 "nvme_io_md": false, 00:15:06.029 "write_zeroes": true, 00:15:06.029 "zcopy": true, 00:15:06.029 "get_zone_info": false, 00:15:06.029 "zone_management": false, 00:15:06.029 "zone_append": false, 00:15:06.029 "compare": false, 00:15:06.029 "compare_and_write": false, 00:15:06.029 "abort": true, 00:15:06.029 "seek_hole": false, 00:15:06.029 "seek_data": false, 00:15:06.029 "copy": true, 00:15:06.029 "nvme_iov_md": false 00:15:06.029 }, 00:15:06.029 "memory_domains": [ 00:15:06.029 { 00:15:06.029 "dma_device_id": "system", 00:15:06.029 "dma_device_type": 1 00:15:06.029 }, 00:15:06.029 { 00:15:06.029 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.029 "dma_device_type": 2 00:15:06.029 } 00:15:06.029 ], 00:15:06.029 "driver_specific": {} 00:15:06.029 }' 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:06.288 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:06.546 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:06.546 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:06.546 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:06.546 11:56:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:06.804 [2024-07-15 11:56:20.188869] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:06.804 [2024-07-15 11:56:20.188901] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:06.804 [2024-07-15 11:56:20.188961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:06.804 [2024-07-15 11:56:20.189014] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:06.804 [2024-07-15 11:56:20.189025] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcdac10 name Existed_Raid, state offline 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1476039 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1476039 ']' 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1476039 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1476039 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 
= sudo ']' 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1476039' 00:15:06.804 killing process with pid 1476039 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1476039 00:15:06.804 [2024-07-15 11:56:20.254248] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:06.804 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1476039 00:15:06.804 [2024-07-15 11:56:20.284115] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:07.062 00:15:07.062 real 0m28.551s 00:15:07.062 user 0m52.355s 00:15:07.062 sys 0m5.154s 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:07.062 ************************************ 00:15:07.062 END TEST raid_state_function_test 00:15:07.062 ************************************ 00:15:07.062 11:56:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:07.062 11:56:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:15:07.062 11:56:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:07.062 11:56:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:07.062 11:56:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:07.062 ************************************ 00:15:07.062 START TEST raid_state_function_test_sb 00:15:07.062 ************************************ 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 3 true 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1480339 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1480339' 00:15:07.062 Process raid pid: 1480339 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1480339 /var/tmp/spdk-raid.sock 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1480339 ']' 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:15:07.062 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:07.062 11:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:07.321 [2024-07-15 11:56:20.666654] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:15:07.321 [2024-07-15 11:56:20.666727] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:07.321 [2024-07-15 11:56:20.786653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:07.321 [2024-07-15 11:56:20.894544] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:07.579 [2024-07-15 11:56:20.957697] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.579 [2024-07-15 11:56:20.957725] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:07.579 11:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:07.579 11:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:15:07.579 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:07.838 [2024-07-15 11:56:21.358834] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:07.838 [2024-07-15 11:56:21.358874] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:07.838 [2024-07-15 11:56:21.358884] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:07.838 [2024-07-15 11:56:21.358896] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:07.838 [2024-07-15 11:56:21.358904] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:07.838 [2024-07-15 11:56:21.358915] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.838 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:15:08.096 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:08.096 "name": "Existed_Raid", 00:15:08.096 "uuid": "e22c0b71-b232-43b9-b089-9057a2804b21", 00:15:08.096 "strip_size_kb": 64, 00:15:08.096 "state": "configuring", 00:15:08.096 "raid_level": "raid0", 00:15:08.096 "superblock": true, 00:15:08.096 "num_base_bdevs": 3, 00:15:08.096 "num_base_bdevs_discovered": 0, 00:15:08.096 "num_base_bdevs_operational": 3, 00:15:08.096 "base_bdevs_list": [ 00:15:08.096 { 00:15:08.096 "name": "BaseBdev1", 00:15:08.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.096 "is_configured": false, 00:15:08.096 "data_offset": 0, 00:15:08.096 "data_size": 0 00:15:08.096 }, 00:15:08.096 { 00:15:08.096 "name": "BaseBdev2", 00:15:08.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.096 "is_configured": false, 00:15:08.096 "data_offset": 0, 00:15:08.096 "data_size": 0 00:15:08.096 }, 00:15:08.096 { 00:15:08.096 "name": "BaseBdev3", 00:15:08.096 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:08.096 "is_configured": false, 00:15:08.096 "data_offset": 0, 00:15:08.096 "data_size": 0 00:15:08.096 } 00:15:08.096 ] 00:15:08.096 }' 00:15:08.096 11:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:08.096 11:56:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.665 11:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:08.924 [2024-07-15 11:56:22.429524] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:08.924 [2024-07-15 11:56:22.429554] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1384b00 name Existed_Raid, state configuring 00:15:08.924 11:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:09.182 [2024-07-15 11:56:22.682226] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:09.182 [2024-07-15 11:56:22.682267] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:09.182 [2024-07-15 11:56:22.682277] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:09.182 [2024-07-15 11:56:22.682289] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:09.182 [2024-07-15 11:56:22.682297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:09.182 [2024-07-15 11:56:22.682308] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:09.182 11:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:09.440 [2024-07-15 11:56:22.940793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:09.440 BaseBdev1 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:09.440 11:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:09.699 11:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:09.958 [ 00:15:09.958 { 00:15:09.958 "name": "BaseBdev1", 00:15:09.958 "aliases": [ 00:15:09.958 "1affe923-987e-48e8-b202-2aa51d65bb6c" 00:15:09.958 ], 00:15:09.958 "product_name": "Malloc disk", 00:15:09.958 "block_size": 512, 00:15:09.958 "num_blocks": 65536, 00:15:09.958 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:09.958 "assigned_rate_limits": { 00:15:09.958 "rw_ios_per_sec": 0, 00:15:09.958 "rw_mbytes_per_sec": 0, 00:15:09.958 "r_mbytes_per_sec": 0, 00:15:09.958 "w_mbytes_per_sec": 0 00:15:09.958 }, 00:15:09.958 "claimed": true, 00:15:09.958 "claim_type": "exclusive_write", 00:15:09.958 "zoned": false, 00:15:09.958 "supported_io_types": { 00:15:09.958 "read": true, 00:15:09.958 "write": true, 00:15:09.958 "unmap": true, 00:15:09.958 "flush": true, 00:15:09.958 "reset": true, 00:15:09.958 "nvme_admin": false, 00:15:09.958 "nvme_io": false, 00:15:09.958 "nvme_io_md": false, 00:15:09.958 "write_zeroes": true, 00:15:09.958 "zcopy": true, 00:15:09.958 "get_zone_info": false, 00:15:09.958 "zone_management": false, 00:15:09.958 "zone_append": false, 00:15:09.958 "compare": false, 00:15:09.958 "compare_and_write": false, 00:15:09.958 "abort": true, 00:15:09.958 "seek_hole": false, 00:15:09.958 "seek_data": false, 00:15:09.958 "copy": true, 00:15:09.958 "nvme_iov_md": false 00:15:09.958 }, 00:15:09.958 "memory_domains": [ 00:15:09.958 { 00:15:09.958 "dma_device_id": "system", 00:15:09.958 "dma_device_type": 1 00:15:09.958 }, 00:15:09.958 { 00:15:09.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:09.958 
"dma_device_type": 2 00:15:09.958 } 00:15:09.958 ], 00:15:09.958 "driver_specific": {} 00:15:09.958 } 00:15:09.958 ] 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.958 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.216 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.217 "name": "Existed_Raid", 00:15:10.217 "uuid": "ca2269e8-1fb0-407c-9e04-09791a119a27", 00:15:10.217 "strip_size_kb": 64, 
00:15:10.217 "state": "configuring", 00:15:10.217 "raid_level": "raid0", 00:15:10.217 "superblock": true, 00:15:10.217 "num_base_bdevs": 3, 00:15:10.217 "num_base_bdevs_discovered": 1, 00:15:10.217 "num_base_bdevs_operational": 3, 00:15:10.217 "base_bdevs_list": [ 00:15:10.217 { 00:15:10.217 "name": "BaseBdev1", 00:15:10.217 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:10.217 "is_configured": true, 00:15:10.217 "data_offset": 2048, 00:15:10.217 "data_size": 63488 00:15:10.217 }, 00:15:10.217 { 00:15:10.217 "name": "BaseBdev2", 00:15:10.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.217 "is_configured": false, 00:15:10.217 "data_offset": 0, 00:15:10.217 "data_size": 0 00:15:10.217 }, 00:15:10.217 { 00:15:10.217 "name": "BaseBdev3", 00:15:10.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.217 "is_configured": false, 00:15:10.217 "data_offset": 0, 00:15:10.217 "data_size": 0 00:15:10.217 } 00:15:10.217 ] 00:15:10.217 }' 00:15:10.217 11:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.217 11:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:10.784 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:11.042 [2024-07-15 11:56:24.536991] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:11.042 [2024-07-15 11:56:24.537031] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1384390 name Existed_Raid, state configuring 00:15:11.042 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:11.300 [2024-07-15 11:56:24.785698] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:11.300 [2024-07-15 11:56:24.787158] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:11.300 [2024-07-15 11:56:24.787188] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:11.300 [2024-07-15 11:56:24.787198] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:11.300 [2024-07-15 11:56:24.787209] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.300 11:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.559 11:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.559 "name": "Existed_Raid", 00:15:11.559 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:11.559 "strip_size_kb": 64, 00:15:11.559 "state": "configuring", 00:15:11.559 "raid_level": "raid0", 00:15:11.559 "superblock": true, 00:15:11.559 "num_base_bdevs": 3, 00:15:11.559 "num_base_bdevs_discovered": 1, 00:15:11.559 "num_base_bdevs_operational": 3, 00:15:11.559 "base_bdevs_list": [ 00:15:11.559 { 00:15:11.559 "name": "BaseBdev1", 00:15:11.559 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:11.559 "is_configured": true, 00:15:11.559 "data_offset": 2048, 00:15:11.559 "data_size": 63488 00:15:11.559 }, 00:15:11.559 { 00:15:11.559 "name": "BaseBdev2", 00:15:11.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.559 "is_configured": false, 00:15:11.559 "data_offset": 0, 00:15:11.559 "data_size": 0 00:15:11.559 }, 00:15:11.559 { 00:15:11.559 "name": "BaseBdev3", 00:15:11.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.559 "is_configured": false, 00:15:11.559 "data_offset": 0, 00:15:11.559 "data_size": 0 00:15:11.559 } 00:15:11.559 ] 00:15:11.559 }' 00:15:11.559 11:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.559 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:12.126 11:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:12.384 
[2024-07-15 11:56:25.915938] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:12.384 BaseBdev2 00:15:12.384 11:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:12.384 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:12.385 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:12.385 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:12.385 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:12.385 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:12.385 11:56:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:12.643 11:56:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:12.902 [ 00:15:12.902 { 00:15:12.902 "name": "BaseBdev2", 00:15:12.902 "aliases": [ 00:15:12.902 "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e" 00:15:12.902 ], 00:15:12.902 "product_name": "Malloc disk", 00:15:12.902 "block_size": 512, 00:15:12.902 "num_blocks": 65536, 00:15:12.902 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:12.902 "assigned_rate_limits": { 00:15:12.902 "rw_ios_per_sec": 0, 00:15:12.902 "rw_mbytes_per_sec": 0, 00:15:12.902 "r_mbytes_per_sec": 0, 00:15:12.902 "w_mbytes_per_sec": 0 00:15:12.902 }, 00:15:12.902 "claimed": true, 00:15:12.902 "claim_type": "exclusive_write", 00:15:12.902 "zoned": false, 00:15:12.902 "supported_io_types": { 00:15:12.902 "read": true, 00:15:12.902 "write": true, 00:15:12.902 "unmap": 
true, 00:15:12.902 "flush": true, 00:15:12.902 "reset": true, 00:15:12.902 "nvme_admin": false, 00:15:12.902 "nvme_io": false, 00:15:12.902 "nvme_io_md": false, 00:15:12.902 "write_zeroes": true, 00:15:12.902 "zcopy": true, 00:15:12.902 "get_zone_info": false, 00:15:12.902 "zone_management": false, 00:15:12.902 "zone_append": false, 00:15:12.902 "compare": false, 00:15:12.902 "compare_and_write": false, 00:15:12.902 "abort": true, 00:15:12.902 "seek_hole": false, 00:15:12.902 "seek_data": false, 00:15:12.902 "copy": true, 00:15:12.902 "nvme_iov_md": false 00:15:12.902 }, 00:15:12.902 "memory_domains": [ 00:15:12.902 { 00:15:12.902 "dma_device_id": "system", 00:15:12.902 "dma_device_type": 1 00:15:12.902 }, 00:15:12.902 { 00:15:12.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:12.902 "dma_device_type": 2 00:15:12.902 } 00:15:12.902 ], 00:15:12.902 "driver_specific": {} 00:15:12.902 } 00:15:12.902 ] 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.902 
11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.902 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.161 11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.161 "name": "Existed_Raid", 00:15:13.161 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:13.161 "strip_size_kb": 64, 00:15:13.161 "state": "configuring", 00:15:13.161 "raid_level": "raid0", 00:15:13.161 "superblock": true, 00:15:13.161 "num_base_bdevs": 3, 00:15:13.161 "num_base_bdevs_discovered": 2, 00:15:13.161 "num_base_bdevs_operational": 3, 00:15:13.161 "base_bdevs_list": [ 00:15:13.161 { 00:15:13.161 "name": "BaseBdev1", 00:15:13.161 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:13.161 "is_configured": true, 00:15:13.161 "data_offset": 2048, 00:15:13.161 "data_size": 63488 00:15:13.161 }, 00:15:13.161 { 00:15:13.161 "name": "BaseBdev2", 00:15:13.161 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:13.161 "is_configured": true, 00:15:13.161 "data_offset": 2048, 00:15:13.161 "data_size": 63488 00:15:13.161 }, 00:15:13.161 { 00:15:13.161 "name": "BaseBdev3", 00:15:13.161 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.161 "is_configured": false, 00:15:13.161 "data_offset": 0, 00:15:13.161 "data_size": 0 00:15:13.161 } 00:15:13.161 ] 00:15:13.161 }' 00:15:13.161 
11:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.161 11:56:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.728 11:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:13.987 [2024-07-15 11:56:27.483556] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:13.987 [2024-07-15 11:56:27.483723] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1385480 00:15:13.987 [2024-07-15 11:56:27.483738] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:13.987 [2024-07-15 11:56:27.483906] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x108b5d0 00:15:13.987 [2024-07-15 11:56:27.484021] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1385480 00:15:13.987 [2024-07-15 11:56:27.484031] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1385480 00:15:13.987 [2024-07-15 11:56:27.484120] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:13.987 BaseBdev3 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:15:13.987 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.246 11:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:14.505 [ 00:15:14.505 { 00:15:14.506 "name": "BaseBdev3", 00:15:14.506 "aliases": [ 00:15:14.506 "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb" 00:15:14.506 ], 00:15:14.506 "product_name": "Malloc disk", 00:15:14.506 "block_size": 512, 00:15:14.506 "num_blocks": 65536, 00:15:14.506 "uuid": "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb", 00:15:14.506 "assigned_rate_limits": { 00:15:14.506 "rw_ios_per_sec": 0, 00:15:14.506 "rw_mbytes_per_sec": 0, 00:15:14.506 "r_mbytes_per_sec": 0, 00:15:14.506 "w_mbytes_per_sec": 0 00:15:14.506 }, 00:15:14.506 "claimed": true, 00:15:14.506 "claim_type": "exclusive_write", 00:15:14.506 "zoned": false, 00:15:14.506 "supported_io_types": { 00:15:14.506 "read": true, 00:15:14.506 "write": true, 00:15:14.506 "unmap": true, 00:15:14.506 "flush": true, 00:15:14.506 "reset": true, 00:15:14.506 "nvme_admin": false, 00:15:14.506 "nvme_io": false, 00:15:14.506 "nvme_io_md": false, 00:15:14.506 "write_zeroes": true, 00:15:14.506 "zcopy": true, 00:15:14.506 "get_zone_info": false, 00:15:14.506 "zone_management": false, 00:15:14.506 "zone_append": false, 00:15:14.506 "compare": false, 00:15:14.506 "compare_and_write": false, 00:15:14.506 "abort": true, 00:15:14.506 "seek_hole": false, 00:15:14.506 "seek_data": false, 00:15:14.506 "copy": true, 00:15:14.506 "nvme_iov_md": false 00:15:14.506 }, 00:15:14.506 "memory_domains": [ 00:15:14.506 { 00:15:14.506 "dma_device_id": "system", 00:15:14.506 "dma_device_type": 1 00:15:14.506 }, 00:15:14.506 { 00:15:14.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.506 
"dma_device_type": 2 00:15:14.506 } 00:15:14.506 ], 00:15:14.506 "driver_specific": {} 00:15:14.506 } 00:15:14.506 ] 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.506 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.765 11:56:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.765 "name": "Existed_Raid", 00:15:14.765 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:14.765 "strip_size_kb": 64, 00:15:14.765 "state": "online", 00:15:14.765 "raid_level": "raid0", 00:15:14.765 "superblock": true, 00:15:14.765 "num_base_bdevs": 3, 00:15:14.765 "num_base_bdevs_discovered": 3, 00:15:14.765 "num_base_bdevs_operational": 3, 00:15:14.765 "base_bdevs_list": [ 00:15:14.765 { 00:15:14.765 "name": "BaseBdev1", 00:15:14.765 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:14.765 "is_configured": true, 00:15:14.765 "data_offset": 2048, 00:15:14.765 "data_size": 63488 00:15:14.765 }, 00:15:14.765 { 00:15:14.765 "name": "BaseBdev2", 00:15:14.765 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:14.765 "is_configured": true, 00:15:14.765 "data_offset": 2048, 00:15:14.765 "data_size": 63488 00:15:14.765 }, 00:15:14.765 { 00:15:14.765 "name": "BaseBdev3", 00:15:14.765 "uuid": "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb", 00:15:14.765 "is_configured": true, 00:15:14.765 "data_offset": 2048, 00:15:14.765 "data_size": 63488 00:15:14.765 } 00:15:14.765 ] 00:15:14.765 }' 00:15:14.765 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.765 11:56:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:15.332 11:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:15.591 [2024-07-15 11:56:29.076066] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:15.591 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:15.591 "name": "Existed_Raid", 00:15:15.591 "aliases": [ 00:15:15.591 "00903472-80df-46e9-93b7-f667b0a8d622" 00:15:15.591 ], 00:15:15.591 "product_name": "Raid Volume", 00:15:15.591 "block_size": 512, 00:15:15.591 "num_blocks": 190464, 00:15:15.591 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:15.591 "assigned_rate_limits": { 00:15:15.591 "rw_ios_per_sec": 0, 00:15:15.591 "rw_mbytes_per_sec": 0, 00:15:15.591 "r_mbytes_per_sec": 0, 00:15:15.591 "w_mbytes_per_sec": 0 00:15:15.591 }, 00:15:15.591 "claimed": false, 00:15:15.591 "zoned": false, 00:15:15.591 "supported_io_types": { 00:15:15.591 "read": true, 00:15:15.591 "write": true, 00:15:15.591 "unmap": true, 00:15:15.591 "flush": true, 00:15:15.591 "reset": true, 00:15:15.591 "nvme_admin": false, 00:15:15.591 "nvme_io": false, 00:15:15.591 "nvme_io_md": false, 00:15:15.591 "write_zeroes": true, 00:15:15.591 "zcopy": false, 00:15:15.591 "get_zone_info": false, 00:15:15.591 "zone_management": false, 00:15:15.591 "zone_append": false, 00:15:15.591 "compare": false, 00:15:15.591 "compare_and_write": false, 00:15:15.591 "abort": false, 00:15:15.591 "seek_hole": false, 00:15:15.591 "seek_data": false, 00:15:15.591 "copy": false, 00:15:15.591 "nvme_iov_md": false 00:15:15.591 }, 00:15:15.591 "memory_domains": [ 00:15:15.591 { 00:15:15.591 "dma_device_id": "system", 00:15:15.591 
"dma_device_type": 1 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.591 "dma_device_type": 2 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "dma_device_id": "system", 00:15:15.591 "dma_device_type": 1 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.591 "dma_device_type": 2 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "dma_device_id": "system", 00:15:15.591 "dma_device_type": 1 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.591 "dma_device_type": 2 00:15:15.591 } 00:15:15.591 ], 00:15:15.591 "driver_specific": { 00:15:15.591 "raid": { 00:15:15.591 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:15.591 "strip_size_kb": 64, 00:15:15.591 "state": "online", 00:15:15.591 "raid_level": "raid0", 00:15:15.591 "superblock": true, 00:15:15.591 "num_base_bdevs": 3, 00:15:15.591 "num_base_bdevs_discovered": 3, 00:15:15.591 "num_base_bdevs_operational": 3, 00:15:15.591 "base_bdevs_list": [ 00:15:15.591 { 00:15:15.591 "name": "BaseBdev1", 00:15:15.591 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:15.591 "is_configured": true, 00:15:15.591 "data_offset": 2048, 00:15:15.591 "data_size": 63488 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "name": "BaseBdev2", 00:15:15.591 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:15.591 "is_configured": true, 00:15:15.591 "data_offset": 2048, 00:15:15.591 "data_size": 63488 00:15:15.591 }, 00:15:15.591 { 00:15:15.591 "name": "BaseBdev3", 00:15:15.591 "uuid": "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb", 00:15:15.591 "is_configured": true, 00:15:15.591 "data_offset": 2048, 00:15:15.591 "data_size": 63488 00:15:15.591 } 00:15:15.591 ] 00:15:15.591 } 00:15:15.591 } 00:15:15.591 }' 00:15:15.591 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:15.591 11:56:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:15.591 BaseBdev2 00:15:15.591 BaseBdev3' 00:15:15.591 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.591 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:15.591 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:15.894 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.894 "name": "BaseBdev1", 00:15:15.894 "aliases": [ 00:15:15.894 "1affe923-987e-48e8-b202-2aa51d65bb6c" 00:15:15.894 ], 00:15:15.894 "product_name": "Malloc disk", 00:15:15.894 "block_size": 512, 00:15:15.894 "num_blocks": 65536, 00:15:15.894 "uuid": "1affe923-987e-48e8-b202-2aa51d65bb6c", 00:15:15.894 "assigned_rate_limits": { 00:15:15.894 "rw_ios_per_sec": 0, 00:15:15.894 "rw_mbytes_per_sec": 0, 00:15:15.894 "r_mbytes_per_sec": 0, 00:15:15.894 "w_mbytes_per_sec": 0 00:15:15.894 }, 00:15:15.894 "claimed": true, 00:15:15.894 "claim_type": "exclusive_write", 00:15:15.894 "zoned": false, 00:15:15.894 "supported_io_types": { 00:15:15.894 "read": true, 00:15:15.894 "write": true, 00:15:15.894 "unmap": true, 00:15:15.894 "flush": true, 00:15:15.894 "reset": true, 00:15:15.894 "nvme_admin": false, 00:15:15.894 "nvme_io": false, 00:15:15.894 "nvme_io_md": false, 00:15:15.894 "write_zeroes": true, 00:15:15.894 "zcopy": true, 00:15:15.894 "get_zone_info": false, 00:15:15.894 "zone_management": false, 00:15:15.894 "zone_append": false, 00:15:15.894 "compare": false, 00:15:15.894 "compare_and_write": false, 00:15:15.894 "abort": true, 00:15:15.894 "seek_hole": false, 00:15:15.894 "seek_data": false, 00:15:15.894 "copy": true, 00:15:15.894 "nvme_iov_md": false 00:15:15.894 }, 00:15:15.894 "memory_domains": 
[ 00:15:15.894 { 00:15:15.894 "dma_device_id": "system", 00:15:15.894 "dma_device_type": 1 00:15:15.894 }, 00:15:15.894 { 00:15:15.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.894 "dma_device_type": 2 00:15:15.894 } 00:15:15.894 ], 00:15:15.894 "driver_specific": {} 00:15:15.894 }' 00:15:15.894 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.894 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.152 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:16.411 11:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.669 "name": "BaseBdev2", 00:15:16.669 "aliases": [ 00:15:16.669 "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e" 00:15:16.669 ], 00:15:16.669 "product_name": "Malloc disk", 00:15:16.669 "block_size": 512, 00:15:16.669 "num_blocks": 65536, 00:15:16.669 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:16.669 "assigned_rate_limits": { 00:15:16.669 "rw_ios_per_sec": 0, 00:15:16.669 "rw_mbytes_per_sec": 0, 00:15:16.669 "r_mbytes_per_sec": 0, 00:15:16.669 "w_mbytes_per_sec": 0 00:15:16.669 }, 00:15:16.669 "claimed": true, 00:15:16.669 "claim_type": "exclusive_write", 00:15:16.669 "zoned": false, 00:15:16.669 "supported_io_types": { 00:15:16.669 "read": true, 00:15:16.669 "write": true, 00:15:16.669 "unmap": true, 00:15:16.669 "flush": true, 00:15:16.669 "reset": true, 00:15:16.669 "nvme_admin": false, 00:15:16.669 "nvme_io": false, 00:15:16.669 "nvme_io_md": false, 00:15:16.669 "write_zeroes": true, 00:15:16.669 "zcopy": true, 00:15:16.669 "get_zone_info": false, 00:15:16.669 "zone_management": false, 00:15:16.669 "zone_append": false, 00:15:16.669 "compare": false, 00:15:16.669 "compare_and_write": false, 00:15:16.669 "abort": true, 00:15:16.669 "seek_hole": false, 00:15:16.669 "seek_data": false, 00:15:16.669 "copy": true, 00:15:16.669 "nvme_iov_md": false 00:15:16.669 }, 00:15:16.669 "memory_domains": [ 00:15:16.669 { 00:15:16.669 "dma_device_id": "system", 00:15:16.669 "dma_device_type": 1 00:15:16.669 }, 00:15:16.669 { 00:15:16.669 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.669 "dma_device_type": 2 00:15:16.669 } 00:15:16.669 ], 00:15:16.669 "driver_specific": {} 00:15:16.669 }' 00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.669 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:16.926 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:17.184 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:17.184 "name": "BaseBdev3", 00:15:17.184 "aliases": [ 00:15:17.184 "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb" 00:15:17.184 ], 00:15:17.184 "product_name": "Malloc disk", 00:15:17.184 "block_size": 512, 00:15:17.184 "num_blocks": 65536, 00:15:17.184 "uuid": "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb", 00:15:17.184 "assigned_rate_limits": { 00:15:17.184 "rw_ios_per_sec": 0, 00:15:17.184 "rw_mbytes_per_sec": 0, 00:15:17.184 "r_mbytes_per_sec": 0, 00:15:17.184 
"w_mbytes_per_sec": 0 00:15:17.184 }, 00:15:17.184 "claimed": true, 00:15:17.184 "claim_type": "exclusive_write", 00:15:17.184 "zoned": false, 00:15:17.184 "supported_io_types": { 00:15:17.184 "read": true, 00:15:17.184 "write": true, 00:15:17.184 "unmap": true, 00:15:17.184 "flush": true, 00:15:17.184 "reset": true, 00:15:17.184 "nvme_admin": false, 00:15:17.184 "nvme_io": false, 00:15:17.184 "nvme_io_md": false, 00:15:17.184 "write_zeroes": true, 00:15:17.184 "zcopy": true, 00:15:17.184 "get_zone_info": false, 00:15:17.184 "zone_management": false, 00:15:17.184 "zone_append": false, 00:15:17.184 "compare": false, 00:15:17.184 "compare_and_write": false, 00:15:17.184 "abort": true, 00:15:17.184 "seek_hole": false, 00:15:17.184 "seek_data": false, 00:15:17.184 "copy": true, 00:15:17.184 "nvme_iov_md": false 00:15:17.184 }, 00:15:17.184 "memory_domains": [ 00:15:17.184 { 00:15:17.184 "dma_device_id": "system", 00:15:17.184 "dma_device_type": 1 00:15:17.184 }, 00:15:17.184 { 00:15:17.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.184 "dma_device_type": 2 00:15:17.184 } 00:15:17.184 ], 00:15:17.184 "driver_specific": {} 00:15:17.184 }' 00:15:17.184 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.184 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.450 11:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:17.450 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:17.450 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.710 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.710 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:17.710 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:17.969 [2024-07-15 11:56:31.337827] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:17.969 [2024-07-15 11:56:31.337851] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:17.969 [2024-07-15 11:56:31.337891] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:17.969 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.228 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.228 "name": "Existed_Raid", 00:15:18.228 "uuid": "00903472-80df-46e9-93b7-f667b0a8d622", 00:15:18.228 "strip_size_kb": 64, 00:15:18.228 "state": "offline", 00:15:18.228 "raid_level": "raid0", 00:15:18.228 "superblock": true, 00:15:18.228 "num_base_bdevs": 3, 00:15:18.228 "num_base_bdevs_discovered": 2, 00:15:18.228 "num_base_bdevs_operational": 2, 00:15:18.228 "base_bdevs_list": [ 00:15:18.228 { 00:15:18.228 "name": null, 00:15:18.228 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.228 "is_configured": false, 00:15:18.228 "data_offset": 2048, 00:15:18.228 "data_size": 63488 00:15:18.228 }, 00:15:18.228 { 00:15:18.228 "name": "BaseBdev2", 00:15:18.228 "uuid": "ff0ac6ba-1eb0-4053-b9e7-2025f7e41f9e", 00:15:18.228 "is_configured": true, 00:15:18.228 "data_offset": 2048, 00:15:18.228 "data_size": 
63488 00:15:18.228 }, 00:15:18.228 { 00:15:18.228 "name": "BaseBdev3", 00:15:18.228 "uuid": "70cf66d9-79d7-49b4-971f-5d6cc7fbdffb", 00:15:18.228 "is_configured": true, 00:15:18.228 "data_offset": 2048, 00:15:18.228 "data_size": 63488 00:15:18.228 } 00:15:18.228 ] 00:15:18.228 }' 00:15:18.228 11:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.228 11:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.796 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:18.796 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:18.796 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.796 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:19.055 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:19.055 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:19.055 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:19.314 [2024-07-15 11:56:32.895146] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:19.574 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:19.574 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:19.574 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:19.574 11:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:19.832 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:19.832 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:19.832 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:20.399 [2024-07-15 11:56:33.705381] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:20.399 [2024-07-15 11:56:33.705425] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1385480 name Existed_Raid, state offline 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:20.399 11:56:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:20.967 BaseBdev2 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:20.967 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:20.968 11:56:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:21.594 11:56:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:22.162 [ 00:15:22.162 { 00:15:22.162 "name": "BaseBdev2", 00:15:22.162 "aliases": [ 00:15:22.162 "d75b8261-fab5-470f-a3db-25c16ea24d34" 00:15:22.162 ], 00:15:22.162 "product_name": "Malloc disk", 00:15:22.162 "block_size": 512, 00:15:22.162 "num_blocks": 65536, 00:15:22.162 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:22.162 "assigned_rate_limits": { 00:15:22.162 "rw_ios_per_sec": 0, 00:15:22.162 "rw_mbytes_per_sec": 0, 00:15:22.162 "r_mbytes_per_sec": 0, 00:15:22.162 "w_mbytes_per_sec": 0 00:15:22.162 }, 00:15:22.162 "claimed": false, 00:15:22.162 "zoned": false, 00:15:22.162 "supported_io_types": { 00:15:22.162 "read": true, 00:15:22.162 "write": true, 00:15:22.162 "unmap": true, 00:15:22.162 "flush": 
true, 00:15:22.162 "reset": true, 00:15:22.162 "nvme_admin": false, 00:15:22.162 "nvme_io": false, 00:15:22.162 "nvme_io_md": false, 00:15:22.162 "write_zeroes": true, 00:15:22.162 "zcopy": true, 00:15:22.162 "get_zone_info": false, 00:15:22.162 "zone_management": false, 00:15:22.162 "zone_append": false, 00:15:22.162 "compare": false, 00:15:22.162 "compare_and_write": false, 00:15:22.162 "abort": true, 00:15:22.162 "seek_hole": false, 00:15:22.162 "seek_data": false, 00:15:22.162 "copy": true, 00:15:22.162 "nvme_iov_md": false 00:15:22.162 }, 00:15:22.162 "memory_domains": [ 00:15:22.162 { 00:15:22.162 "dma_device_id": "system", 00:15:22.162 "dma_device_type": 1 00:15:22.162 }, 00:15:22.162 { 00:15:22.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.162 "dma_device_type": 2 00:15:22.162 } 00:15:22.162 ], 00:15:22.162 "driver_specific": {} 00:15:22.162 } 00:15:22.162 ] 00:15:22.162 11:56:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:22.162 11:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:22.162 11:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:22.162 11:56:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:22.421 BaseBdev3 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:22.681 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.940 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:23.198 [ 00:15:23.198 { 00:15:23.198 "name": "BaseBdev3", 00:15:23.198 "aliases": [ 00:15:23.198 "3de28c80-64c2-44ca-9503-327095903339" 00:15:23.198 ], 00:15:23.198 "product_name": "Malloc disk", 00:15:23.198 "block_size": 512, 00:15:23.198 "num_blocks": 65536, 00:15:23.198 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:23.198 "assigned_rate_limits": { 00:15:23.198 "rw_ios_per_sec": 0, 00:15:23.198 "rw_mbytes_per_sec": 0, 00:15:23.198 "r_mbytes_per_sec": 0, 00:15:23.198 "w_mbytes_per_sec": 0 00:15:23.198 }, 00:15:23.198 "claimed": false, 00:15:23.198 "zoned": false, 00:15:23.198 "supported_io_types": { 00:15:23.198 "read": true, 00:15:23.198 "write": true, 00:15:23.198 "unmap": true, 00:15:23.198 "flush": true, 00:15:23.198 "reset": true, 00:15:23.198 "nvme_admin": false, 00:15:23.198 "nvme_io": false, 00:15:23.198 "nvme_io_md": false, 00:15:23.198 "write_zeroes": true, 00:15:23.198 "zcopy": true, 00:15:23.198 "get_zone_info": false, 00:15:23.198 "zone_management": false, 00:15:23.198 "zone_append": false, 00:15:23.198 "compare": false, 00:15:23.198 "compare_and_write": false, 00:15:23.198 "abort": true, 00:15:23.198 "seek_hole": false, 00:15:23.198 "seek_data": false, 00:15:23.198 "copy": true, 00:15:23.198 "nvme_iov_md": false 00:15:23.198 }, 00:15:23.198 "memory_domains": [ 00:15:23.198 { 00:15:23.198 "dma_device_id": "system", 00:15:23.198 "dma_device_type": 1 
00:15:23.198 }, 00:15:23.198 { 00:15:23.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.198 "dma_device_type": 2 00:15:23.198 } 00:15:23.198 ], 00:15:23.198 "driver_specific": {} 00:15:23.198 } 00:15:23.198 ] 00:15:23.456 11:56:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:23.456 11:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:23.456 11:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:23.457 11:56:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:23.715 [2024-07-15 11:56:37.283729] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:23.715 [2024-07-15 11:56:37.283771] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:23.715 [2024-07-15 11:56:37.283789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:23.715 [2024-07-15 11:56:37.285371] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.975 "name": "Existed_Raid", 00:15:23.975 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:23.975 "strip_size_kb": 64, 00:15:23.975 "state": "configuring", 00:15:23.975 "raid_level": "raid0", 00:15:23.975 "superblock": true, 00:15:23.975 "num_base_bdevs": 3, 00:15:23.975 "num_base_bdevs_discovered": 2, 00:15:23.975 "num_base_bdevs_operational": 3, 00:15:23.975 "base_bdevs_list": [ 00:15:23.975 { 00:15:23.975 "name": "BaseBdev1", 00:15:23.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.975 "is_configured": false, 00:15:23.975 "data_offset": 0, 00:15:23.975 "data_size": 0 00:15:23.975 }, 00:15:23.975 { 00:15:23.975 "name": "BaseBdev2", 00:15:23.975 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:23.975 "is_configured": true, 00:15:23.975 "data_offset": 2048, 00:15:23.975 "data_size": 63488 00:15:23.975 }, 00:15:23.975 { 00:15:23.975 "name": "BaseBdev3", 00:15:23.975 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:23.975 "is_configured": true, 00:15:23.975 "data_offset": 2048, 00:15:23.975 
"data_size": 63488 00:15:23.975 } 00:15:23.975 ] 00:15:23.975 }' 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.975 11:56:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:25.353 [2024-07-15 11:56:38.880047] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:25.353 11:56:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.613 11:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.613 "name": "Existed_Raid", 00:15:25.613 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:25.613 "strip_size_kb": 64, 00:15:25.613 "state": "configuring", 00:15:25.613 "raid_level": "raid0", 00:15:25.613 "superblock": true, 00:15:25.613 "num_base_bdevs": 3, 00:15:25.613 "num_base_bdevs_discovered": 1, 00:15:25.613 "num_base_bdevs_operational": 3, 00:15:25.613 "base_bdevs_list": [ 00:15:25.613 { 00:15:25.613 "name": "BaseBdev1", 00:15:25.613 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:25.613 "is_configured": false, 00:15:25.613 "data_offset": 0, 00:15:25.613 "data_size": 0 00:15:25.613 }, 00:15:25.613 { 00:15:25.613 "name": null, 00:15:25.613 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:25.613 "is_configured": false, 00:15:25.613 "data_offset": 2048, 00:15:25.613 "data_size": 63488 00:15:25.613 }, 00:15:25.613 { 00:15:25.613 "name": "BaseBdev3", 00:15:25.613 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:25.613 "is_configured": true, 00:15:25.613 "data_offset": 2048, 00:15:25.613 "data_size": 63488 00:15:25.613 } 00:15:25.613 ] 00:15:25.613 }' 00:15:25.613 11:56:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.613 11:56:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.992 11:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.992 11:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:26.992 11:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:15:26.992 11:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:27.251 [2024-07-15 11:56:40.685351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:27.251 BaseBdev1 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:27.251 11:56:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:27.819 11:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:28.078 [ 00:15:28.078 { 00:15:28.078 "name": "BaseBdev1", 00:15:28.078 "aliases": [ 00:15:28.078 "962e84f9-2df9-4905-98a3-7143da6a7e7c" 00:15:28.078 ], 00:15:28.078 "product_name": "Malloc disk", 00:15:28.078 "block_size": 512, 00:15:28.078 "num_blocks": 65536, 00:15:28.078 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:28.078 "assigned_rate_limits": { 00:15:28.078 "rw_ios_per_sec": 0, 00:15:28.078 "rw_mbytes_per_sec": 0, 00:15:28.078 "r_mbytes_per_sec": 0, 00:15:28.078 
"w_mbytes_per_sec": 0 00:15:28.078 }, 00:15:28.078 "claimed": true, 00:15:28.078 "claim_type": "exclusive_write", 00:15:28.078 "zoned": false, 00:15:28.078 "supported_io_types": { 00:15:28.078 "read": true, 00:15:28.078 "write": true, 00:15:28.078 "unmap": true, 00:15:28.078 "flush": true, 00:15:28.078 "reset": true, 00:15:28.078 "nvme_admin": false, 00:15:28.078 "nvme_io": false, 00:15:28.078 "nvme_io_md": false, 00:15:28.078 "write_zeroes": true, 00:15:28.078 "zcopy": true, 00:15:28.078 "get_zone_info": false, 00:15:28.078 "zone_management": false, 00:15:28.078 "zone_append": false, 00:15:28.078 "compare": false, 00:15:28.078 "compare_and_write": false, 00:15:28.078 "abort": true, 00:15:28.078 "seek_hole": false, 00:15:28.078 "seek_data": false, 00:15:28.078 "copy": true, 00:15:28.078 "nvme_iov_md": false 00:15:28.078 }, 00:15:28.079 "memory_domains": [ 00:15:28.079 { 00:15:28.079 "dma_device_id": "system", 00:15:28.079 "dma_device_type": 1 00:15:28.079 }, 00:15:28.079 { 00:15:28.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:28.079 "dma_device_type": 2 00:15:28.079 } 00:15:28.079 ], 00:15:28.079 "driver_specific": {} 00:15:28.079 } 00:15:28.079 ] 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.079 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.338 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.338 "name": "Existed_Raid", 00:15:28.338 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:28.338 "strip_size_kb": 64, 00:15:28.338 "state": "configuring", 00:15:28.338 "raid_level": "raid0", 00:15:28.338 "superblock": true, 00:15:28.338 "num_base_bdevs": 3, 00:15:28.338 "num_base_bdevs_discovered": 2, 00:15:28.338 "num_base_bdevs_operational": 3, 00:15:28.338 "base_bdevs_list": [ 00:15:28.338 { 00:15:28.338 "name": "BaseBdev1", 00:15:28.338 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:28.338 "is_configured": true, 00:15:28.338 "data_offset": 2048, 00:15:28.338 "data_size": 63488 00:15:28.338 }, 00:15:28.338 { 00:15:28.338 "name": null, 00:15:28.338 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:28.338 "is_configured": false, 00:15:28.338 "data_offset": 2048, 00:15:28.338 "data_size": 63488 00:15:28.338 }, 00:15:28.338 { 00:15:28.338 "name": "BaseBdev3", 00:15:28.338 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:28.338 "is_configured": true, 00:15:28.338 "data_offset": 2048, 00:15:28.338 "data_size": 63488 00:15:28.338 } 
00:15:28.338 ] 00:15:28.338 }' 00:15:28.338 11:56:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.338 11:56:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.904 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:28.904 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.163 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:29.163 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:29.421 [2024-07-15 11:56:42.851111] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.421 
11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.421 11:56:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.679 11:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.679 "name": "Existed_Raid", 00:15:29.679 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:29.679 "strip_size_kb": 64, 00:15:29.679 "state": "configuring", 00:15:29.679 "raid_level": "raid0", 00:15:29.679 "superblock": true, 00:15:29.679 "num_base_bdevs": 3, 00:15:29.679 "num_base_bdevs_discovered": 1, 00:15:29.679 "num_base_bdevs_operational": 3, 00:15:29.679 "base_bdevs_list": [ 00:15:29.679 { 00:15:29.679 "name": "BaseBdev1", 00:15:29.679 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:29.679 "is_configured": true, 00:15:29.679 "data_offset": 2048, 00:15:29.679 "data_size": 63488 00:15:29.679 }, 00:15:29.679 { 00:15:29.679 "name": null, 00:15:29.679 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:29.679 "is_configured": false, 00:15:29.679 "data_offset": 2048, 00:15:29.679 "data_size": 63488 00:15:29.679 }, 00:15:29.679 { 00:15:29.679 "name": null, 00:15:29.679 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:29.679 "is_configured": false, 00:15:29.679 "data_offset": 2048, 00:15:29.679 "data_size": 63488 00:15:29.679 } 00:15:29.679 ] 00:15:29.679 }' 00:15:29.679 11:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.679 11:56:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:30.246 11:56:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.246 11:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:30.505 11:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:30.505 11:56:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:30.764 [2024-07-15 11:56:44.194698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.764 11:56:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.764 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.023 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.023 "name": "Existed_Raid", 00:15:31.023 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:31.023 "strip_size_kb": 64, 00:15:31.023 "state": "configuring", 00:15:31.023 "raid_level": "raid0", 00:15:31.023 "superblock": true, 00:15:31.023 "num_base_bdevs": 3, 00:15:31.023 "num_base_bdevs_discovered": 2, 00:15:31.023 "num_base_bdevs_operational": 3, 00:15:31.023 "base_bdevs_list": [ 00:15:31.023 { 00:15:31.023 "name": "BaseBdev1", 00:15:31.023 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:31.023 "is_configured": true, 00:15:31.023 "data_offset": 2048, 00:15:31.023 "data_size": 63488 00:15:31.023 }, 00:15:31.023 { 00:15:31.023 "name": null, 00:15:31.023 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:31.023 "is_configured": false, 00:15:31.023 "data_offset": 2048, 00:15:31.023 "data_size": 63488 00:15:31.023 }, 00:15:31.023 { 00:15:31.023 "name": "BaseBdev3", 00:15:31.023 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:31.023 "is_configured": true, 00:15:31.023 "data_offset": 2048, 00:15:31.023 "data_size": 63488 00:15:31.023 } 00:15:31.023 ] 00:15:31.023 }' 00:15:31.023 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.023 11:56:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:31.589 11:56:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.589 11:56:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:31.589 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:31.589 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:31.847 [2024-07-15 11:56:45.405913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.847 11:56:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.108 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.108 "name": "Existed_Raid", 00:15:32.108 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:32.108 "strip_size_kb": 64, 00:15:32.108 "state": "configuring", 00:15:32.108 "raid_level": "raid0", 00:15:32.108 "superblock": true, 00:15:32.108 "num_base_bdevs": 3, 00:15:32.108 "num_base_bdevs_discovered": 1, 00:15:32.108 "num_base_bdevs_operational": 3, 00:15:32.108 "base_bdevs_list": [ 00:15:32.108 { 00:15:32.108 "name": null, 00:15:32.108 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:32.108 "is_configured": false, 00:15:32.108 "data_offset": 2048, 00:15:32.108 "data_size": 63488 00:15:32.108 }, 00:15:32.108 { 00:15:32.108 "name": null, 00:15:32.108 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:32.108 "is_configured": false, 00:15:32.108 "data_offset": 2048, 00:15:32.108 "data_size": 63488 00:15:32.108 }, 00:15:32.108 { 00:15:32.108 "name": "BaseBdev3", 00:15:32.108 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:32.108 "is_configured": true, 00:15:32.108 "data_offset": 2048, 00:15:32.108 "data_size": 63488 00:15:32.108 } 00:15:32.108 ] 00:15:32.108 }' 00:15:32.108 11:56:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.108 11:56:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.043 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.043 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:33.043 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:33.043 11:56:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:33.301 [2024-07-15 11:56:46.757894] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.301 11:56:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:33.559 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:33.559 "name": 
"Existed_Raid", 00:15:33.559 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:33.559 "strip_size_kb": 64, 00:15:33.559 "state": "configuring", 00:15:33.559 "raid_level": "raid0", 00:15:33.559 "superblock": true, 00:15:33.559 "num_base_bdevs": 3, 00:15:33.559 "num_base_bdevs_discovered": 2, 00:15:33.559 "num_base_bdevs_operational": 3, 00:15:33.559 "base_bdevs_list": [ 00:15:33.559 { 00:15:33.559 "name": null, 00:15:33.559 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:33.559 "is_configured": false, 00:15:33.559 "data_offset": 2048, 00:15:33.559 "data_size": 63488 00:15:33.559 }, 00:15:33.559 { 00:15:33.559 "name": "BaseBdev2", 00:15:33.559 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:33.559 "is_configured": true, 00:15:33.559 "data_offset": 2048, 00:15:33.559 "data_size": 63488 00:15:33.559 }, 00:15:33.559 { 00:15:33.559 "name": "BaseBdev3", 00:15:33.559 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:33.559 "is_configured": true, 00:15:33.559 "data_offset": 2048, 00:15:33.559 "data_size": 63488 00:15:33.559 } 00:15:33.559 ] 00:15:33.559 }' 00:15:33.559 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:33.559 11:56:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:34.126 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.126 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:34.385 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:34.385 11:56:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.385 11:56:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:34.642 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 962e84f9-2df9-4905-98a3-7143da6a7e7c 00:15:34.901 [2024-07-15 11:56:48.350624] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:34.901 [2024-07-15 11:56:48.350778] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1385c10 00:15:34.901 [2024-07-15 11:56:48.350792] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:34.901 [2024-07-15 11:56:48.350964] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13878f0 00:15:34.901 [2024-07-15 11:56:48.351088] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1385c10 00:15:34.901 [2024-07-15 11:56:48.351098] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1385c10 00:15:34.901 [2024-07-15 11:56:48.351193] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:34.901 NewBaseBdev 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:15:34.901 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:15:34.901 11:56:48 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:35.160 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:35.420 [ 00:15:35.420 { 00:15:35.420 "name": "NewBaseBdev", 00:15:35.420 "aliases": [ 00:15:35.420 "962e84f9-2df9-4905-98a3-7143da6a7e7c" 00:15:35.420 ], 00:15:35.420 "product_name": "Malloc disk", 00:15:35.420 "block_size": 512, 00:15:35.420 "num_blocks": 65536, 00:15:35.420 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:35.420 "assigned_rate_limits": { 00:15:35.420 "rw_ios_per_sec": 0, 00:15:35.420 "rw_mbytes_per_sec": 0, 00:15:35.420 "r_mbytes_per_sec": 0, 00:15:35.420 "w_mbytes_per_sec": 0 00:15:35.420 }, 00:15:35.420 "claimed": true, 00:15:35.420 "claim_type": "exclusive_write", 00:15:35.420 "zoned": false, 00:15:35.420 "supported_io_types": { 00:15:35.420 "read": true, 00:15:35.420 "write": true, 00:15:35.420 "unmap": true, 00:15:35.420 "flush": true, 00:15:35.420 "reset": true, 00:15:35.420 "nvme_admin": false, 00:15:35.420 "nvme_io": false, 00:15:35.420 "nvme_io_md": false, 00:15:35.420 "write_zeroes": true, 00:15:35.420 "zcopy": true, 00:15:35.420 "get_zone_info": false, 00:15:35.420 "zone_management": false, 00:15:35.420 "zone_append": false, 00:15:35.420 "compare": false, 00:15:35.420 "compare_and_write": false, 00:15:35.420 "abort": true, 00:15:35.420 "seek_hole": false, 00:15:35.420 "seek_data": false, 00:15:35.420 "copy": true, 00:15:35.420 "nvme_iov_md": false 00:15:35.420 }, 00:15:35.420 "memory_domains": [ 00:15:35.420 { 00:15:35.420 "dma_device_id": "system", 00:15:35.420 "dma_device_type": 1 00:15:35.420 }, 00:15:35.420 { 00:15:35.420 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:35.420 "dma_device_type": 2 00:15:35.420 } 
00:15:35.420 ], 00:15:35.420 "driver_specific": {} 00:15:35.420 } 00:15:35.420 ] 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.420 11:56:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:35.682 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:35.682 "name": "Existed_Raid", 00:15:35.682 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:35.682 "strip_size_kb": 64, 00:15:35.682 "state": "online", 00:15:35.682 
"raid_level": "raid0", 00:15:35.682 "superblock": true, 00:15:35.682 "num_base_bdevs": 3, 00:15:35.682 "num_base_bdevs_discovered": 3, 00:15:35.682 "num_base_bdevs_operational": 3, 00:15:35.682 "base_bdevs_list": [ 00:15:35.682 { 00:15:35.682 "name": "NewBaseBdev", 00:15:35.682 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:35.682 "is_configured": true, 00:15:35.682 "data_offset": 2048, 00:15:35.682 "data_size": 63488 00:15:35.682 }, 00:15:35.682 { 00:15:35.682 "name": "BaseBdev2", 00:15:35.682 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:35.682 "is_configured": true, 00:15:35.682 "data_offset": 2048, 00:15:35.682 "data_size": 63488 00:15:35.682 }, 00:15:35.682 { 00:15:35.682 "name": "BaseBdev3", 00:15:35.682 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:35.682 "is_configured": true, 00:15:35.682 "data_offset": 2048, 00:15:35.682 "data_size": 63488 00:15:35.682 } 00:15:35.682 ] 00:15:35.682 }' 00:15:35.682 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:35.682 11:56:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:36.250 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:36.509 [2024-07-15 11:56:49.866946] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:36.509 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:36.509 "name": "Existed_Raid", 00:15:36.509 "aliases": [ 00:15:36.509 "0a5bebf8-210e-4451-bfd3-2d4741d65f37" 00:15:36.509 ], 00:15:36.509 "product_name": "Raid Volume", 00:15:36.509 "block_size": 512, 00:15:36.509 "num_blocks": 190464, 00:15:36.509 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:36.509 "assigned_rate_limits": { 00:15:36.509 "rw_ios_per_sec": 0, 00:15:36.509 "rw_mbytes_per_sec": 0, 00:15:36.509 "r_mbytes_per_sec": 0, 00:15:36.509 "w_mbytes_per_sec": 0 00:15:36.509 }, 00:15:36.509 "claimed": false, 00:15:36.509 "zoned": false, 00:15:36.509 "supported_io_types": { 00:15:36.509 "read": true, 00:15:36.509 "write": true, 00:15:36.509 "unmap": true, 00:15:36.509 "flush": true, 00:15:36.509 "reset": true, 00:15:36.509 "nvme_admin": false, 00:15:36.509 "nvme_io": false, 00:15:36.509 "nvme_io_md": false, 00:15:36.509 "write_zeroes": true, 00:15:36.509 "zcopy": false, 00:15:36.509 "get_zone_info": false, 00:15:36.509 "zone_management": false, 00:15:36.509 "zone_append": false, 00:15:36.509 "compare": false, 00:15:36.509 "compare_and_write": false, 00:15:36.509 "abort": false, 00:15:36.509 "seek_hole": false, 00:15:36.509 "seek_data": false, 00:15:36.509 "copy": false, 00:15:36.509 "nvme_iov_md": false 00:15:36.509 }, 00:15:36.509 "memory_domains": [ 00:15:36.509 { 00:15:36.509 "dma_device_id": "system", 00:15:36.509 "dma_device_type": 1 00:15:36.509 }, 00:15:36.509 { 00:15:36.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.509 "dma_device_type": 2 00:15:36.509 }, 00:15:36.509 { 00:15:36.509 "dma_device_id": "system", 00:15:36.509 "dma_device_type": 1 00:15:36.509 
}, 00:15:36.509 { 00:15:36.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.509 "dma_device_type": 2 00:15:36.509 }, 00:15:36.509 { 00:15:36.510 "dma_device_id": "system", 00:15:36.510 "dma_device_type": 1 00:15:36.510 }, 00:15:36.510 { 00:15:36.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.510 "dma_device_type": 2 00:15:36.510 } 00:15:36.510 ], 00:15:36.510 "driver_specific": { 00:15:36.510 "raid": { 00:15:36.510 "uuid": "0a5bebf8-210e-4451-bfd3-2d4741d65f37", 00:15:36.510 "strip_size_kb": 64, 00:15:36.510 "state": "online", 00:15:36.510 "raid_level": "raid0", 00:15:36.510 "superblock": true, 00:15:36.510 "num_base_bdevs": 3, 00:15:36.510 "num_base_bdevs_discovered": 3, 00:15:36.510 "num_base_bdevs_operational": 3, 00:15:36.510 "base_bdevs_list": [ 00:15:36.510 { 00:15:36.510 "name": "NewBaseBdev", 00:15:36.510 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:36.510 "is_configured": true, 00:15:36.510 "data_offset": 2048, 00:15:36.510 "data_size": 63488 00:15:36.510 }, 00:15:36.510 { 00:15:36.510 "name": "BaseBdev2", 00:15:36.510 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:36.510 "is_configured": true, 00:15:36.510 "data_offset": 2048, 00:15:36.510 "data_size": 63488 00:15:36.510 }, 00:15:36.510 { 00:15:36.510 "name": "BaseBdev3", 00:15:36.510 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:36.510 "is_configured": true, 00:15:36.510 "data_offset": 2048, 00:15:36.510 "data_size": 63488 00:15:36.510 } 00:15:36.510 ] 00:15:36.510 } 00:15:36.510 } 00:15:36.510 }' 00:15:36.510 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:36.510 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:36.510 BaseBdev2 00:15:36.510 BaseBdev3' 00:15:36.510 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:36.510 
11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:36.510 11:56:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:36.770 "name": "NewBaseBdev", 00:15:36.770 "aliases": [ 00:15:36.770 "962e84f9-2df9-4905-98a3-7143da6a7e7c" 00:15:36.770 ], 00:15:36.770 "product_name": "Malloc disk", 00:15:36.770 "block_size": 512, 00:15:36.770 "num_blocks": 65536, 00:15:36.770 "uuid": "962e84f9-2df9-4905-98a3-7143da6a7e7c", 00:15:36.770 "assigned_rate_limits": { 00:15:36.770 "rw_ios_per_sec": 0, 00:15:36.770 "rw_mbytes_per_sec": 0, 00:15:36.770 "r_mbytes_per_sec": 0, 00:15:36.770 "w_mbytes_per_sec": 0 00:15:36.770 }, 00:15:36.770 "claimed": true, 00:15:36.770 "claim_type": "exclusive_write", 00:15:36.770 "zoned": false, 00:15:36.770 "supported_io_types": { 00:15:36.770 "read": true, 00:15:36.770 "write": true, 00:15:36.770 "unmap": true, 00:15:36.770 "flush": true, 00:15:36.770 "reset": true, 00:15:36.770 "nvme_admin": false, 00:15:36.770 "nvme_io": false, 00:15:36.770 "nvme_io_md": false, 00:15:36.770 "write_zeroes": true, 00:15:36.770 "zcopy": true, 00:15:36.770 "get_zone_info": false, 00:15:36.770 "zone_management": false, 00:15:36.770 "zone_append": false, 00:15:36.770 "compare": false, 00:15:36.770 "compare_and_write": false, 00:15:36.770 "abort": true, 00:15:36.770 "seek_hole": false, 00:15:36.770 "seek_data": false, 00:15:36.770 "copy": true, 00:15:36.770 "nvme_iov_md": false 00:15:36.770 }, 00:15:36.770 "memory_domains": [ 00:15:36.770 { 00:15:36.770 "dma_device_id": "system", 00:15:36.770 "dma_device_type": 1 00:15:36.770 }, 00:15:36.770 { 00:15:36.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:36.770 "dma_device_type": 2 00:15:36.770 } 00:15:36.770 ], 00:15:36.770 
"driver_specific": {} 00:15:36.770 }' 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:36.770 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:37.030 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.289 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.289 "name": "BaseBdev2", 00:15:37.289 "aliases": [ 00:15:37.289 "d75b8261-fab5-470f-a3db-25c16ea24d34" 00:15:37.289 ], 00:15:37.289 "product_name": 
"Malloc disk", 00:15:37.289 "block_size": 512, 00:15:37.289 "num_blocks": 65536, 00:15:37.289 "uuid": "d75b8261-fab5-470f-a3db-25c16ea24d34", 00:15:37.289 "assigned_rate_limits": { 00:15:37.289 "rw_ios_per_sec": 0, 00:15:37.289 "rw_mbytes_per_sec": 0, 00:15:37.289 "r_mbytes_per_sec": 0, 00:15:37.289 "w_mbytes_per_sec": 0 00:15:37.289 }, 00:15:37.289 "claimed": true, 00:15:37.289 "claim_type": "exclusive_write", 00:15:37.289 "zoned": false, 00:15:37.289 "supported_io_types": { 00:15:37.289 "read": true, 00:15:37.289 "write": true, 00:15:37.289 "unmap": true, 00:15:37.289 "flush": true, 00:15:37.289 "reset": true, 00:15:37.289 "nvme_admin": false, 00:15:37.289 "nvme_io": false, 00:15:37.289 "nvme_io_md": false, 00:15:37.289 "write_zeroes": true, 00:15:37.289 "zcopy": true, 00:15:37.289 "get_zone_info": false, 00:15:37.289 "zone_management": false, 00:15:37.289 "zone_append": false, 00:15:37.289 "compare": false, 00:15:37.289 "compare_and_write": false, 00:15:37.289 "abort": true, 00:15:37.289 "seek_hole": false, 00:15:37.289 "seek_data": false, 00:15:37.289 "copy": true, 00:15:37.289 "nvme_iov_md": false 00:15:37.289 }, 00:15:37.289 "memory_domains": [ 00:15:37.289 { 00:15:37.289 "dma_device_id": "system", 00:15:37.289 "dma_device_type": 1 00:15:37.289 }, 00:15:37.289 { 00:15:37.289 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.289 "dma_device_type": 2 00:15:37.289 } 00:15:37.289 ], 00:15:37.289 "driver_specific": {} 00:15:37.289 }' 00:15:37.289 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.289 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.289 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.289 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.548 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.548 
11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.548 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.548 11:56:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:37.548 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.807 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.807 "name": "BaseBdev3", 00:15:37.807 "aliases": [ 00:15:37.807 "3de28c80-64c2-44ca-9503-327095903339" 00:15:37.807 ], 00:15:37.807 "product_name": "Malloc disk", 00:15:37.807 "block_size": 512, 00:15:37.807 "num_blocks": 65536, 00:15:37.807 "uuid": "3de28c80-64c2-44ca-9503-327095903339", 00:15:37.807 "assigned_rate_limits": { 00:15:37.807 "rw_ios_per_sec": 0, 00:15:37.807 "rw_mbytes_per_sec": 0, 00:15:37.807 "r_mbytes_per_sec": 0, 00:15:37.807 "w_mbytes_per_sec": 0 00:15:37.807 }, 00:15:37.807 "claimed": true, 00:15:37.807 "claim_type": "exclusive_write", 00:15:37.807 "zoned": false, 00:15:37.807 "supported_io_types": { 00:15:37.807 "read": true, 00:15:37.807 "write": true, 00:15:37.807 "unmap": true, 
00:15:37.807 "flush": true, 00:15:37.807 "reset": true, 00:15:37.807 "nvme_admin": false, 00:15:37.807 "nvme_io": false, 00:15:37.807 "nvme_io_md": false, 00:15:37.807 "write_zeroes": true, 00:15:37.807 "zcopy": true, 00:15:37.807 "get_zone_info": false, 00:15:37.807 "zone_management": false, 00:15:37.807 "zone_append": false, 00:15:37.807 "compare": false, 00:15:37.807 "compare_and_write": false, 00:15:37.807 "abort": true, 00:15:37.807 "seek_hole": false, 00:15:37.807 "seek_data": false, 00:15:37.807 "copy": true, 00:15:37.807 "nvme_iov_md": false 00:15:37.807 }, 00:15:37.807 "memory_domains": [ 00:15:37.807 { 00:15:37.807 "dma_device_id": "system", 00:15:37.807 "dma_device_type": 1 00:15:37.807 }, 00:15:37.807 { 00:15:37.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.807 "dma_device_type": 2 00:15:37.807 } 00:15:37.807 ], 00:15:37.807 "driver_specific": {} 00:15:37.807 }' 00:15:37.807 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.067 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.326 11:56:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.326 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.326 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:38.586 [2024-07-15 11:56:51.944193] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:38.586 [2024-07-15 11:56:51.944220] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:38.586 [2024-07-15 11:56:51.944274] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:38.586 [2024-07-15 11:56:51.944326] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:38.586 [2024-07-15 11:56:51.944337] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1385c10 name Existed_Raid, state offline 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1480339 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1480339 ']' 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1480339 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:38.586 11:56:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1480339 00:15:38.586 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:38.586 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:15:38.586 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1480339' 00:15:38.586 killing process with pid 1480339 00:15:38.586 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1480339 00:15:38.586 [2024-07-15 11:56:52.010165] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:38.586 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1480339 00:15:38.586 [2024-07-15 11:56:52.039945] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:38.845 11:56:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:38.845 00:15:38.845 real 0m31.667s 00:15:38.845 user 0m58.753s 00:15:38.845 sys 0m5.604s 00:15:38.845 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:38.845 11:56:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:38.845 ************************************ 00:15:38.845 END TEST raid_state_function_test_sb 00:15:38.845 ************************************ 00:15:38.845 11:56:52 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:38.845 11:56:52 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:15:38.845 11:56:52 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:15:38.845 11:56:52 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:38.845 11:56:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:38.845 ************************************ 00:15:38.845 START TEST raid_superblock_test 00:15:38.845 ************************************ 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 3 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=raid0 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1484990 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1484990 /var/tmp/spdk-raid.sock 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1484990 ']' 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:38.845 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:38.845 11:56:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:38.845 [2024-07-15 11:56:52.405146] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:15:38.845 [2024-07-15 11:56:52.405209] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1484990 ] 00:15:39.105 [2024-07-15 11:56:52.545070] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:39.105 [2024-07-15 11:56:52.679663] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:39.364 [2024-07-15 11:56:52.746481] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:39.364 [2024-07-15 11:56:52.746520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:39.932 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:40.190 malloc1 00:15:40.190 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:40.449 [2024-07-15 11:56:53.909718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:40.449 [2024-07-15 11:56:53.909770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.449 [2024-07-15 11:56:53.909791] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d8a560 00:15:40.449 [2024-07-15 11:56:53.909803] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.449 [2024-07-15 11:56:53.911441] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.449 [2024-07-15 11:56:53.911469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:40.449 pt1 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:40.449 11:56:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:40.449 11:56:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:40.708 malloc2 00:15:40.708 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:40.967 [2024-07-15 11:56:54.403830] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:40.967 [2024-07-15 11:56:54.403881] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.967 [2024-07-15 11:56:54.403904] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e285b0 00:15:40.967 [2024-07-15 11:56:54.403917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.967 [2024-07-15 11:56:54.405507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.967 [2024-07-15 11:56:54.405535] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:40.967 pt2 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:40.967 11:56:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:40.967 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:41.226 malloc3 00:15:41.226 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:41.485 [2024-07-15 11:56:54.890968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:41.485 [2024-07-15 11:56:54.891017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.485 [2024-07-15 11:56:54.891034] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e28be0 00:15:41.485 [2024-07-15 11:56:54.891047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.485 [2024-07-15 11:56:54.892570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.485 [2024-07-15 11:56:54.892597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:41.485 pt3 00:15:41.485 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:41.485 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:41.485 11:56:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:41.744 [2024-07-15 11:56:55.135643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:41.744 [2024-07-15 11:56:55.137007] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:15:41.744 [2024-07-15 11:56:55.137064] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:41.744 [2024-07-15 11:56:55.137211] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e29510 00:15:41.744 [2024-07-15 11:56:55.137223] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:41.744 [2024-07-15 11:56:55.137422] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d89180 00:15:41.744 [2024-07-15 11:56:55.137563] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e29510 00:15:41.744 [2024-07-15 11:56:55.137574] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e29510 00:15:41.744 [2024-07-15 11:56:55.137672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.744 
11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.744 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:42.004 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.004 "name": "raid_bdev1", 00:15:42.004 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:42.004 "strip_size_kb": 64, 00:15:42.004 "state": "online", 00:15:42.004 "raid_level": "raid0", 00:15:42.004 "superblock": true, 00:15:42.004 "num_base_bdevs": 3, 00:15:42.004 "num_base_bdevs_discovered": 3, 00:15:42.004 "num_base_bdevs_operational": 3, 00:15:42.004 "base_bdevs_list": [ 00:15:42.004 { 00:15:42.004 "name": "pt1", 00:15:42.004 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.004 "is_configured": true, 00:15:42.004 "data_offset": 2048, 00:15:42.004 "data_size": 63488 00:15:42.004 }, 00:15:42.004 { 00:15:42.004 "name": "pt2", 00:15:42.004 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.004 "is_configured": true, 00:15:42.004 "data_offset": 2048, 00:15:42.004 "data_size": 63488 00:15:42.004 }, 00:15:42.004 { 00:15:42.004 "name": "pt3", 00:15:42.004 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:42.004 "is_configured": true, 00:15:42.004 "data_offset": 2048, 00:15:42.004 "data_size": 63488 00:15:42.004 } 00:15:42.004 ] 00:15:42.004 }' 00:15:42.004 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.004 11:56:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:42.573 11:56:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:42.573 [2024-07-15 11:56:56.106449] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:42.573 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:42.573 "name": "raid_bdev1", 00:15:42.573 "aliases": [ 00:15:42.573 "96cc13ac-c03a-4291-b7b0-f684025fd116" 00:15:42.573 ], 00:15:42.573 "product_name": "Raid Volume", 00:15:42.573 "block_size": 512, 00:15:42.573 "num_blocks": 190464, 00:15:42.573 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:42.573 "assigned_rate_limits": { 00:15:42.573 "rw_ios_per_sec": 0, 00:15:42.573 "rw_mbytes_per_sec": 0, 00:15:42.573 "r_mbytes_per_sec": 0, 00:15:42.573 "w_mbytes_per_sec": 0 00:15:42.573 }, 00:15:42.573 "claimed": false, 00:15:42.573 "zoned": false, 00:15:42.573 "supported_io_types": { 00:15:42.573 "read": true, 00:15:42.573 "write": true, 00:15:42.573 "unmap": true, 00:15:42.573 "flush": true, 00:15:42.573 "reset": true, 00:15:42.573 "nvme_admin": false, 00:15:42.573 "nvme_io": false, 00:15:42.573 "nvme_io_md": false, 00:15:42.573 "write_zeroes": true, 00:15:42.573 "zcopy": false, 00:15:42.573 "get_zone_info": false, 00:15:42.574 "zone_management": false, 00:15:42.574 "zone_append": false, 00:15:42.574 "compare": false, 00:15:42.574 "compare_and_write": false, 00:15:42.574 "abort": false, 00:15:42.574 
"seek_hole": false, 00:15:42.574 "seek_data": false, 00:15:42.574 "copy": false, 00:15:42.574 "nvme_iov_md": false 00:15:42.574 }, 00:15:42.574 "memory_domains": [ 00:15:42.574 { 00:15:42.574 "dma_device_id": "system", 00:15:42.574 "dma_device_type": 1 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.574 "dma_device_type": 2 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "dma_device_id": "system", 00:15:42.574 "dma_device_type": 1 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.574 "dma_device_type": 2 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "dma_device_id": "system", 00:15:42.574 "dma_device_type": 1 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.574 "dma_device_type": 2 00:15:42.574 } 00:15:42.574 ], 00:15:42.574 "driver_specific": { 00:15:42.574 "raid": { 00:15:42.574 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:42.574 "strip_size_kb": 64, 00:15:42.574 "state": "online", 00:15:42.574 "raid_level": "raid0", 00:15:42.574 "superblock": true, 00:15:42.574 "num_base_bdevs": 3, 00:15:42.574 "num_base_bdevs_discovered": 3, 00:15:42.574 "num_base_bdevs_operational": 3, 00:15:42.574 "base_bdevs_list": [ 00:15:42.574 { 00:15:42.574 "name": "pt1", 00:15:42.574 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.574 "is_configured": true, 00:15:42.574 "data_offset": 2048, 00:15:42.574 "data_size": 63488 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "name": "pt2", 00:15:42.574 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.574 "is_configured": true, 00:15:42.574 "data_offset": 2048, 00:15:42.574 "data_size": 63488 00:15:42.574 }, 00:15:42.574 { 00:15:42.574 "name": "pt3", 00:15:42.574 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:42.574 "is_configured": true, 00:15:42.574 "data_offset": 2048, 00:15:42.574 "data_size": 63488 00:15:42.574 } 00:15:42.574 ] 00:15:42.574 } 00:15:42.574 } 00:15:42.574 }' 
00:15:42.574 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:42.834 pt2 00:15:42.834 pt3' 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:42.834 "name": "pt1", 00:15:42.834 "aliases": [ 00:15:42.834 "00000000-0000-0000-0000-000000000001" 00:15:42.834 ], 00:15:42.834 "product_name": "passthru", 00:15:42.834 "block_size": 512, 00:15:42.834 "num_blocks": 65536, 00:15:42.834 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.834 "assigned_rate_limits": { 00:15:42.834 "rw_ios_per_sec": 0, 00:15:42.834 "rw_mbytes_per_sec": 0, 00:15:42.834 "r_mbytes_per_sec": 0, 00:15:42.834 "w_mbytes_per_sec": 0 00:15:42.834 }, 00:15:42.834 "claimed": true, 00:15:42.834 "claim_type": "exclusive_write", 00:15:42.834 "zoned": false, 00:15:42.834 "supported_io_types": { 00:15:42.834 "read": true, 00:15:42.834 "write": true, 00:15:42.834 "unmap": true, 00:15:42.834 "flush": true, 00:15:42.834 "reset": true, 00:15:42.834 "nvme_admin": false, 00:15:42.834 "nvme_io": false, 00:15:42.834 "nvme_io_md": false, 00:15:42.834 "write_zeroes": true, 00:15:42.834 "zcopy": true, 00:15:42.834 "get_zone_info": false, 00:15:42.834 "zone_management": false, 00:15:42.834 "zone_append": false, 00:15:42.834 "compare": false, 00:15:42.834 "compare_and_write": false, 00:15:42.834 "abort": true, 00:15:42.834 "seek_hole": false, 00:15:42.834 
"seek_data": false, 00:15:42.834 "copy": true, 00:15:42.834 "nvme_iov_md": false 00:15:42.834 }, 00:15:42.834 "memory_domains": [ 00:15:42.834 { 00:15:42.834 "dma_device_id": "system", 00:15:42.834 "dma_device_type": 1 00:15:42.834 }, 00:15:42.834 { 00:15:42.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.834 "dma_device_type": 2 00:15:42.834 } 00:15:42.834 ], 00:15:42.834 "driver_specific": { 00:15:42.834 "passthru": { 00:15:42.834 "name": "pt1", 00:15:42.834 "base_bdev_name": "malloc1" 00:15:42.834 } 00:15:42.834 } 00:15:42.834 }' 00:15:42.834 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.094 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:43.370 11:56:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.673 "name": "pt2", 00:15:43.673 "aliases": [ 00:15:43.673 "00000000-0000-0000-0000-000000000002" 00:15:43.673 ], 00:15:43.673 "product_name": "passthru", 00:15:43.673 "block_size": 512, 00:15:43.673 "num_blocks": 65536, 00:15:43.673 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.673 "assigned_rate_limits": { 00:15:43.673 "rw_ios_per_sec": 0, 00:15:43.673 "rw_mbytes_per_sec": 0, 00:15:43.673 "r_mbytes_per_sec": 0, 00:15:43.673 "w_mbytes_per_sec": 0 00:15:43.673 }, 00:15:43.673 "claimed": true, 00:15:43.673 "claim_type": "exclusive_write", 00:15:43.673 "zoned": false, 00:15:43.673 "supported_io_types": { 00:15:43.673 "read": true, 00:15:43.673 "write": true, 00:15:43.673 "unmap": true, 00:15:43.673 "flush": true, 00:15:43.673 "reset": true, 00:15:43.673 "nvme_admin": false, 00:15:43.673 "nvme_io": false, 00:15:43.673 "nvme_io_md": false, 00:15:43.673 "write_zeroes": true, 00:15:43.673 "zcopy": true, 00:15:43.673 "get_zone_info": false, 00:15:43.673 "zone_management": false, 00:15:43.673 "zone_append": false, 00:15:43.673 "compare": false, 00:15:43.673 "compare_and_write": false, 00:15:43.673 "abort": true, 00:15:43.673 "seek_hole": false, 00:15:43.673 "seek_data": false, 00:15:43.673 "copy": true, 00:15:43.673 "nvme_iov_md": false 00:15:43.673 }, 00:15:43.673 "memory_domains": [ 00:15:43.673 { 00:15:43.673 "dma_device_id": "system", 00:15:43.673 "dma_device_type": 1 00:15:43.673 }, 00:15:43.673 { 00:15:43.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.673 "dma_device_type": 2 00:15:43.673 } 00:15:43.673 ], 00:15:43.673 "driver_specific": { 00:15:43.673 "passthru": { 00:15:43.673 "name": "pt2", 00:15:43.673 "base_bdev_name": "malloc2" 00:15:43.673 } 00:15:43.673 } 00:15:43.673 }' 00:15:43.673 11:56:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.673 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:43.931 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.190 "name": "pt3", 00:15:44.190 "aliases": [ 00:15:44.190 "00000000-0000-0000-0000-000000000003" 00:15:44.190 ], 00:15:44.190 "product_name": "passthru", 00:15:44.190 "block_size": 512, 00:15:44.190 "num_blocks": 65536, 00:15:44.190 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:44.190 "assigned_rate_limits": { 
00:15:44.190 "rw_ios_per_sec": 0, 00:15:44.190 "rw_mbytes_per_sec": 0, 00:15:44.190 "r_mbytes_per_sec": 0, 00:15:44.190 "w_mbytes_per_sec": 0 00:15:44.190 }, 00:15:44.190 "claimed": true, 00:15:44.190 "claim_type": "exclusive_write", 00:15:44.190 "zoned": false, 00:15:44.190 "supported_io_types": { 00:15:44.190 "read": true, 00:15:44.190 "write": true, 00:15:44.190 "unmap": true, 00:15:44.190 "flush": true, 00:15:44.190 "reset": true, 00:15:44.190 "nvme_admin": false, 00:15:44.190 "nvme_io": false, 00:15:44.190 "nvme_io_md": false, 00:15:44.190 "write_zeroes": true, 00:15:44.190 "zcopy": true, 00:15:44.190 "get_zone_info": false, 00:15:44.190 "zone_management": false, 00:15:44.190 "zone_append": false, 00:15:44.190 "compare": false, 00:15:44.190 "compare_and_write": false, 00:15:44.190 "abort": true, 00:15:44.190 "seek_hole": false, 00:15:44.190 "seek_data": false, 00:15:44.190 "copy": true, 00:15:44.190 "nvme_iov_md": false 00:15:44.190 }, 00:15:44.190 "memory_domains": [ 00:15:44.190 { 00:15:44.190 "dma_device_id": "system", 00:15:44.190 "dma_device_type": 1 00:15:44.190 }, 00:15:44.190 { 00:15:44.190 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.190 "dma_device_type": 2 00:15:44.190 } 00:15:44.190 ], 00:15:44.190 "driver_specific": { 00:15:44.190 "passthru": { 00:15:44.190 "name": "pt3", 00:15:44.190 "base_bdev_name": "malloc3" 00:15:44.190 } 00:15:44.190 } 00:15:44.190 }' 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.190 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:44.450 11:56:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:44.709 [2024-07-15 11:56:58.183946] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:44.709 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=96cc13ac-c03a-4291-b7b0-f684025fd116 00:15:44.709 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 96cc13ac-c03a-4291-b7b0-f684025fd116 ']' 00:15:44.709 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:44.968 [2024-07-15 11:56:58.432331] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:44.968 [2024-07-15 11:56:58.432357] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:44.968 [2024-07-15 11:56:58.432407] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:44.968 [2024-07-15 11:56:58.432461] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:15:44.968 [2024-07-15 11:56:58.432472] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e29510 name raid_bdev1, state offline 00:15:44.968 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:44.968 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:45.227 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:45.227 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:45.227 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:45.227 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:45.486 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:45.486 11:56:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:45.745 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:45.745 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:46.005 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:46.005 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:46.264 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:46.524 [2024-07-15 11:56:59.900246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:46.524 [2024-07-15 11:56:59.901623] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:46.524 [2024-07-15 11:56:59.901666] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:46.524 [2024-07-15 11:56:59.901719] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:46.524 [2024-07-15 11:56:59.901759] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:46.524 [2024-07-15 11:56:59.901782] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:46.524 [2024-07-15 11:56:59.901801] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:46.524 [2024-07-15 11:56:59.901810] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2c500 name raid_bdev1, state configuring 00:15:46.524 request: 00:15:46.524 { 00:15:46.524 "name": "raid_bdev1", 00:15:46.524 "raid_level": "raid0", 00:15:46.524 "base_bdevs": [ 00:15:46.524 "malloc1", 00:15:46.524 "malloc2", 00:15:46.524 "malloc3" 00:15:46.524 ], 00:15:46.524 "strip_size_kb": 64, 00:15:46.524 "superblock": false, 00:15:46.524 "method": "bdev_raid_create", 00:15:46.524 "req_id": 1 00:15:46.524 } 00:15:46.524 Got JSON-RPC error response 00:15:46.524 response: 00:15:46.524 { 00:15:46.524 "code": -17, 00:15:46.524 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:46.524 } 00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:46.524 11:56:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:46.783 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:46.783 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:46.783 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:47.042 [2024-07-15 11:57:00.397497] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:47.042 [2024-07-15 11:57:00.397546] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.042 [2024-07-15 11:57:00.397564] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2a210 00:15:47.042 [2024-07-15 11:57:00.397577] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.042 [2024-07-15 11:57:00.399171] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.042 [2024-07-15 11:57:00.399199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:47.042 [2024-07-15 11:57:00.399264] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:47.042 [2024-07-15 11:57:00.399288] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:47.042 pt1 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:47.042 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:47.302 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:47.302 "name": "raid_bdev1", 00:15:47.302 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:47.302 "strip_size_kb": 64, 00:15:47.302 "state": "configuring", 00:15:47.302 "raid_level": "raid0", 00:15:47.302 "superblock": true, 00:15:47.302 "num_base_bdevs": 3, 00:15:47.302 "num_base_bdevs_discovered": 1, 00:15:47.302 "num_base_bdevs_operational": 3, 00:15:47.302 "base_bdevs_list": [ 00:15:47.302 { 00:15:47.302 "name": "pt1", 00:15:47.302 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:47.302 
"is_configured": true, 00:15:47.302 "data_offset": 2048, 00:15:47.302 "data_size": 63488 00:15:47.302 }, 00:15:47.302 { 00:15:47.302 "name": null, 00:15:47.302 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:47.302 "is_configured": false, 00:15:47.302 "data_offset": 2048, 00:15:47.302 "data_size": 63488 00:15:47.302 }, 00:15:47.302 { 00:15:47.302 "name": null, 00:15:47.302 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:47.302 "is_configured": false, 00:15:47.302 "data_offset": 2048, 00:15:47.302 "data_size": 63488 00:15:47.302 } 00:15:47.302 ] 00:15:47.302 }' 00:15:47.302 11:57:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:47.302 11:57:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:47.871 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:47.871 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:48.131 [2024-07-15 11:57:01.524504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:48.131 [2024-07-15 11:57:01.524559] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.131 [2024-07-15 11:57:01.524577] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e2c500 00:15:48.131 [2024-07-15 11:57:01.524590] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.131 [2024-07-15 11:57:01.524950] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.131 [2024-07-15 11:57:01.524969] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:48.131 [2024-07-15 11:57:01.525034] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:48.131 [2024-07-15 
11:57:01.525052] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:48.131 pt2 00:15:48.131 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:48.390 [2024-07-15 11:57:01.757136] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.390 11:57:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.649 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.649 "name": "raid_bdev1", 00:15:48.649 
"uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:48.649 "strip_size_kb": 64, 00:15:48.649 "state": "configuring", 00:15:48.649 "raid_level": "raid0", 00:15:48.649 "superblock": true, 00:15:48.649 "num_base_bdevs": 3, 00:15:48.649 "num_base_bdevs_discovered": 1, 00:15:48.649 "num_base_bdevs_operational": 3, 00:15:48.649 "base_bdevs_list": [ 00:15:48.649 { 00:15:48.649 "name": "pt1", 00:15:48.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:48.649 "is_configured": true, 00:15:48.649 "data_offset": 2048, 00:15:48.649 "data_size": 63488 00:15:48.649 }, 00:15:48.649 { 00:15:48.649 "name": null, 00:15:48.649 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:48.649 "is_configured": false, 00:15:48.649 "data_offset": 2048, 00:15:48.649 "data_size": 63488 00:15:48.649 }, 00:15:48.649 { 00:15:48.649 "name": null, 00:15:48.649 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:48.649 "is_configured": false, 00:15:48.649 "data_offset": 2048, 00:15:48.649 "data_size": 63488 00:15:48.649 } 00:15:48.649 ] 00:15:48.649 }' 00:15:48.649 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.649 11:57:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.216 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:49.216 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:49.216 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:49.475 [2024-07-15 11:57:02.860046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:49.475 [2024-07-15 11:57:02.860099] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:49.475 [2024-07-15 11:57:02.860122] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d8c9c0 00:15:49.475 [2024-07-15 11:57:02.860134] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:49.475 [2024-07-15 11:57:02.860471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:49.475 [2024-07-15 11:57:02.860487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:49.475 [2024-07-15 11:57:02.860547] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:49.475 [2024-07-15 11:57:02.860566] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:49.475 pt2 00:15:49.475 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:49.475 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:49.475 11:57:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:49.734 [2024-07-15 11:57:03.108712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:49.734 [2024-07-15 11:57:03.108750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:49.734 [2024-07-15 11:57:03.108768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d8d050 00:15:49.734 [2024-07-15 11:57:03.108780] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:49.734 [2024-07-15 11:57:03.109075] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:49.734 [2024-07-15 11:57:03.109092] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:49.734 [2024-07-15 11:57:03.109142] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:49.734 
[2024-07-15 11:57:03.109160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:49.734 [2024-07-15 11:57:03.109262] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e2b030 00:15:49.734 [2024-07-15 11:57:03.109272] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:49.734 [2024-07-15 11:57:03.109437] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d89c10 00:15:49.734 [2024-07-15 11:57:03.109559] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e2b030 00:15:49.734 [2024-07-15 11:57:03.109569] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e2b030 00:15:49.734 [2024-07-15 11:57:03.109664] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:49.734 pt3 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:49.734 11:57:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:49.734 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:49.993 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:49.993 "name": "raid_bdev1", 00:15:49.993 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:49.993 "strip_size_kb": 64, 00:15:49.993 "state": "online", 00:15:49.993 "raid_level": "raid0", 00:15:49.993 "superblock": true, 00:15:49.993 "num_base_bdevs": 3, 00:15:49.993 "num_base_bdevs_discovered": 3, 00:15:49.993 "num_base_bdevs_operational": 3, 00:15:49.993 "base_bdevs_list": [ 00:15:49.993 { 00:15:49.993 "name": "pt1", 00:15:49.993 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:49.993 "is_configured": true, 00:15:49.993 "data_offset": 2048, 00:15:49.993 "data_size": 63488 00:15:49.993 }, 00:15:49.993 { 00:15:49.993 "name": "pt2", 00:15:49.993 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:49.993 "is_configured": true, 00:15:49.993 "data_offset": 2048, 00:15:49.993 "data_size": 63488 00:15:49.993 }, 00:15:49.993 { 00:15:49.993 "name": "pt3", 00:15:49.993 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:49.993 "is_configured": true, 00:15:49.993 "data_offset": 2048, 00:15:49.993 "data_size": 63488 00:15:49.993 } 00:15:49.993 ] 00:15:49.993 }' 00:15:49.993 11:57:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:49.993 11:57:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:50.561 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:50.820 [2024-07-15 11:57:04.244000] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:50.820 "name": "raid_bdev1", 00:15:50.820 "aliases": [ 00:15:50.820 "96cc13ac-c03a-4291-b7b0-f684025fd116" 00:15:50.820 ], 00:15:50.820 "product_name": "Raid Volume", 00:15:50.820 "block_size": 512, 00:15:50.820 "num_blocks": 190464, 00:15:50.820 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:50.820 "assigned_rate_limits": { 00:15:50.820 "rw_ios_per_sec": 0, 00:15:50.820 "rw_mbytes_per_sec": 0, 00:15:50.820 "r_mbytes_per_sec": 0, 00:15:50.820 "w_mbytes_per_sec": 0 00:15:50.820 }, 00:15:50.820 "claimed": false, 00:15:50.820 "zoned": false, 00:15:50.820 "supported_io_types": { 00:15:50.820 "read": true, 00:15:50.820 "write": true, 00:15:50.820 "unmap": true, 00:15:50.820 "flush": true, 00:15:50.820 "reset": true, 00:15:50.820 "nvme_admin": false, 00:15:50.820 "nvme_io": false, 00:15:50.820 "nvme_io_md": false, 00:15:50.820 "write_zeroes": true, 00:15:50.820 "zcopy": false, 00:15:50.820 
"get_zone_info": false, 00:15:50.820 "zone_management": false, 00:15:50.820 "zone_append": false, 00:15:50.820 "compare": false, 00:15:50.820 "compare_and_write": false, 00:15:50.820 "abort": false, 00:15:50.820 "seek_hole": false, 00:15:50.820 "seek_data": false, 00:15:50.820 "copy": false, 00:15:50.820 "nvme_iov_md": false 00:15:50.820 }, 00:15:50.820 "memory_domains": [ 00:15:50.820 { 00:15:50.820 "dma_device_id": "system", 00:15:50.820 "dma_device_type": 1 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.820 "dma_device_type": 2 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "dma_device_id": "system", 00:15:50.820 "dma_device_type": 1 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.820 "dma_device_type": 2 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "dma_device_id": "system", 00:15:50.820 "dma_device_type": 1 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.820 "dma_device_type": 2 00:15:50.820 } 00:15:50.820 ], 00:15:50.820 "driver_specific": { 00:15:50.820 "raid": { 00:15:50.820 "uuid": "96cc13ac-c03a-4291-b7b0-f684025fd116", 00:15:50.820 "strip_size_kb": 64, 00:15:50.820 "state": "online", 00:15:50.820 "raid_level": "raid0", 00:15:50.820 "superblock": true, 00:15:50.820 "num_base_bdevs": 3, 00:15:50.820 "num_base_bdevs_discovered": 3, 00:15:50.820 "num_base_bdevs_operational": 3, 00:15:50.820 "base_bdevs_list": [ 00:15:50.820 { 00:15:50.820 "name": "pt1", 00:15:50.820 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:50.820 "is_configured": true, 00:15:50.820 "data_offset": 2048, 00:15:50.820 "data_size": 63488 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "name": "pt2", 00:15:50.820 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:50.820 "is_configured": true, 00:15:50.820 "data_offset": 2048, 00:15:50.820 "data_size": 63488 00:15:50.820 }, 00:15:50.820 { 00:15:50.820 "name": "pt3", 00:15:50.820 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:15:50.820 "is_configured": true, 00:15:50.820 "data_offset": 2048, 00:15:50.820 "data_size": 63488 00:15:50.820 } 00:15:50.820 ] 00:15:50.820 } 00:15:50.820 } 00:15:50.820 }' 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:50.820 pt2 00:15:50.820 pt3' 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:50.820 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.079 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.079 "name": "pt1", 00:15:51.079 "aliases": [ 00:15:51.079 "00000000-0000-0000-0000-000000000001" 00:15:51.079 ], 00:15:51.079 "product_name": "passthru", 00:15:51.079 "block_size": 512, 00:15:51.079 "num_blocks": 65536, 00:15:51.079 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:51.079 "assigned_rate_limits": { 00:15:51.079 "rw_ios_per_sec": 0, 00:15:51.079 "rw_mbytes_per_sec": 0, 00:15:51.079 "r_mbytes_per_sec": 0, 00:15:51.079 "w_mbytes_per_sec": 0 00:15:51.079 }, 00:15:51.079 "claimed": true, 00:15:51.079 "claim_type": "exclusive_write", 00:15:51.079 "zoned": false, 00:15:51.079 "supported_io_types": { 00:15:51.079 "read": true, 00:15:51.079 "write": true, 00:15:51.079 "unmap": true, 00:15:51.079 "flush": true, 00:15:51.079 "reset": true, 00:15:51.079 "nvme_admin": false, 00:15:51.079 "nvme_io": false, 00:15:51.079 "nvme_io_md": false, 00:15:51.079 "write_zeroes": true, 00:15:51.079 "zcopy": true, 00:15:51.079 "get_zone_info": false, 
00:15:51.079 "zone_management": false, 00:15:51.079 "zone_append": false, 00:15:51.079 "compare": false, 00:15:51.079 "compare_and_write": false, 00:15:51.079 "abort": true, 00:15:51.079 "seek_hole": false, 00:15:51.079 "seek_data": false, 00:15:51.079 "copy": true, 00:15:51.079 "nvme_iov_md": false 00:15:51.079 }, 00:15:51.079 "memory_domains": [ 00:15:51.079 { 00:15:51.079 "dma_device_id": "system", 00:15:51.079 "dma_device_type": 1 00:15:51.079 }, 00:15:51.079 { 00:15:51.079 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.079 "dma_device_type": 2 00:15:51.079 } 00:15:51.079 ], 00:15:51.079 "driver_specific": { 00:15:51.079 "passthru": { 00:15:51.079 "name": "pt1", 00:15:51.079 "base_bdev_name": "malloc1" 00:15:51.079 } 00:15:51.079 } 00:15:51.079 }' 00:15:51.079 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.079 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.079 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.079 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.338 11:57:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:51.338 11:57:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.597 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.597 "name": "pt2", 00:15:51.597 "aliases": [ 00:15:51.597 "00000000-0000-0000-0000-000000000002" 00:15:51.597 ], 00:15:51.597 "product_name": "passthru", 00:15:51.597 "block_size": 512, 00:15:51.597 "num_blocks": 65536, 00:15:51.597 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:51.597 "assigned_rate_limits": { 00:15:51.597 "rw_ios_per_sec": 0, 00:15:51.597 "rw_mbytes_per_sec": 0, 00:15:51.597 "r_mbytes_per_sec": 0, 00:15:51.597 "w_mbytes_per_sec": 0 00:15:51.597 }, 00:15:51.597 "claimed": true, 00:15:51.597 "claim_type": "exclusive_write", 00:15:51.597 "zoned": false, 00:15:51.597 "supported_io_types": { 00:15:51.597 "read": true, 00:15:51.597 "write": true, 00:15:51.597 "unmap": true, 00:15:51.597 "flush": true, 00:15:51.597 "reset": true, 00:15:51.597 "nvme_admin": false, 00:15:51.597 "nvme_io": false, 00:15:51.597 "nvme_io_md": false, 00:15:51.597 "write_zeroes": true, 00:15:51.597 "zcopy": true, 00:15:51.597 "get_zone_info": false, 00:15:51.597 "zone_management": false, 00:15:51.597 "zone_append": false, 00:15:51.597 "compare": false, 00:15:51.597 "compare_and_write": false, 00:15:51.597 "abort": true, 00:15:51.597 "seek_hole": false, 00:15:51.597 "seek_data": false, 00:15:51.597 "copy": true, 00:15:51.597 "nvme_iov_md": false 00:15:51.597 }, 00:15:51.597 "memory_domains": [ 00:15:51.597 { 00:15:51.597 "dma_device_id": "system", 00:15:51.597 "dma_device_type": 1 00:15:51.597 }, 00:15:51.597 { 00:15:51.597 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.597 
"dma_device_type": 2 00:15:51.597 } 00:15:51.597 ], 00:15:51.597 "driver_specific": { 00:15:51.597 "passthru": { 00:15:51.597 "name": "pt2", 00:15:51.597 "base_bdev_name": "malloc2" 00:15:51.597 } 00:15:51.597 } 00:15:51.597 }' 00:15:51.597 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.856 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.115 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.115 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.115 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:52.115 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:52.115 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:52.373 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:52.373 "name": "pt3", 00:15:52.373 "aliases": [ 00:15:52.373 
"00000000-0000-0000-0000-000000000003" 00:15:52.373 ], 00:15:52.373 "product_name": "passthru", 00:15:52.373 "block_size": 512, 00:15:52.373 "num_blocks": 65536, 00:15:52.373 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:52.373 "assigned_rate_limits": { 00:15:52.373 "rw_ios_per_sec": 0, 00:15:52.373 "rw_mbytes_per_sec": 0, 00:15:52.373 "r_mbytes_per_sec": 0, 00:15:52.373 "w_mbytes_per_sec": 0 00:15:52.373 }, 00:15:52.373 "claimed": true, 00:15:52.373 "claim_type": "exclusive_write", 00:15:52.373 "zoned": false, 00:15:52.373 "supported_io_types": { 00:15:52.373 "read": true, 00:15:52.373 "write": true, 00:15:52.373 "unmap": true, 00:15:52.373 "flush": true, 00:15:52.373 "reset": true, 00:15:52.373 "nvme_admin": false, 00:15:52.373 "nvme_io": false, 00:15:52.373 "nvme_io_md": false, 00:15:52.373 "write_zeroes": true, 00:15:52.373 "zcopy": true, 00:15:52.373 "get_zone_info": false, 00:15:52.373 "zone_management": false, 00:15:52.373 "zone_append": false, 00:15:52.373 "compare": false, 00:15:52.373 "compare_and_write": false, 00:15:52.373 "abort": true, 00:15:52.373 "seek_hole": false, 00:15:52.373 "seek_data": false, 00:15:52.373 "copy": true, 00:15:52.373 "nvme_iov_md": false 00:15:52.373 }, 00:15:52.373 "memory_domains": [ 00:15:52.373 { 00:15:52.373 "dma_device_id": "system", 00:15:52.373 "dma_device_type": 1 00:15:52.373 }, 00:15:52.373 { 00:15:52.373 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:52.373 "dma_device_type": 2 00:15:52.373 } 00:15:52.373 ], 00:15:52.373 "driver_specific": { 00:15:52.373 "passthru": { 00:15:52.373 "name": "pt3", 00:15:52.373 "base_bdev_name": "malloc3" 00:15:52.374 } 00:15:52.374 } 00:15:52.374 }' 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:52.374 11:57:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:52.374 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.632 11:57:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:52.632 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:52.890 [2024-07-15 11:57:06.361641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 96cc13ac-c03a-4291-b7b0-f684025fd116 '!=' 96cc13ac-c03a-4291-b7b0-f684025fd116 ']' 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1484990 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1484990 ']' 00:15:52.890 11:57:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1484990 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:52.890 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1484990 00:15:52.891 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:15:52.891 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:15:52.891 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1484990' 00:15:52.891 killing process with pid 1484990 00:15:52.891 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1484990 00:15:52.891 [2024-07-15 11:57:06.434614] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:52.891 [2024-07-15 11:57:06.434673] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.891 [2024-07-15 11:57:06.434735] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:52.891 [2024-07-15 11:57:06.434749] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e2b030 name raid_bdev1, state offline 00:15:52.891 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1484990 00:15:52.891 [2024-07-15 11:57:06.461451] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:53.148 11:57:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:15:53.148 00:15:53.148 real 0m14.326s 00:15:53.148 user 0m25.847s 00:15:53.148 sys 0m2.567s 00:15:53.148 11:57:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:15:53.148 11:57:06 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.148 ************************************ 00:15:53.148 END TEST raid_superblock_test 00:15:53.148 ************************************ 00:15:53.148 11:57:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:15:53.148 11:57:06 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:15:53.148 11:57:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:15:53.148 11:57:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:15:53.148 11:57:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:53.405 ************************************ 00:15:53.405 START TEST raid_read_error_test 00:15:53.405 ************************************ 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 read 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:53.405 11:57:06 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.d8zCUjkc9e 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1487714 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1487714 /var/tmp/spdk-raid.sock 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1487714 ']' 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:53.405 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:15:53.405 11:57:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:53.405 [2024-07-15 11:57:06.837510] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:15:53.405 [2024-07-15 11:57:06.837578] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1487714 ] 00:15:53.405 [2024-07-15 11:57:06.968151] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:53.662 [2024-07-15 11:57:07.068891] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:53.662 [2024-07-15 11:57:07.132095] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:53.662 [2024-07-15 11:57:07.132141] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:54.227 11:57:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:15:54.227 11:57:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:15:54.227 11:57:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:54.227 11:57:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:54.484 BaseBdev1_malloc 00:15:54.484 11:57:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:54.741 true 00:15:54.741 11:57:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:54.998 [2024-07-15 11:57:08.510553] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:54.998 [2024-07-15 11:57:08.510595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:54.998 [2024-07-15 11:57:08.510616] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21084e0 00:15:54.998 [2024-07-15 11:57:08.510628] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.998 [2024-07-15 11:57:08.512274] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:54.998 [2024-07-15 11:57:08.512302] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:54.998 BaseBdev1 00:15:54.998 11:57:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:54.998 11:57:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:55.256 BaseBdev2_malloc 00:15:55.256 11:57:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:55.514 true 00:15:55.514 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:55.773 [2024-07-15 11:57:09.245146] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:55.773 [2024-07-15 11:57:09.245190] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:55.773 [2024-07-15 11:57:09.245208] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210d7b0 00:15:55.773 [2024-07-15 11:57:09.245221] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:55.773 [2024-07-15 11:57:09.246603] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:55.773 [2024-07-15 11:57:09.246630] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:55.773 BaseBdev2 00:15:55.773 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:15:55.773 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:56.031 BaseBdev3_malloc 00:15:56.031 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:56.290 true 00:15:56.290 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:56.548 [2024-07-15 11:57:09.971718] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:56.548 [2024-07-15 11:57:09.971766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.548 [2024-07-15 11:57:09.971787] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210f8f0 00:15:56.548 [2024-07-15 11:57:09.971800] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.548 [2024-07-15 11:57:09.973289] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.548 [2024-07-15 11:57:09.973317] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:56.548 BaseBdev3 00:15:56.548 11:57:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:56.806 [2024-07-15 11:57:10.212392] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:56.806 [2024-07-15 11:57:10.213755] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:56.806 [2024-07-15 11:57:10.213825] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:56.806 [2024-07-15 11:57:10.214033] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21114b0 00:15:56.806 [2024-07-15 11:57:10.214044] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:56.806 [2024-07-15 11:57:10.214248] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21135e0 00:15:56.806 [2024-07-15 11:57:10.214401] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21114b0 00:15:56.806 [2024-07-15 11:57:10.214411] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21114b0 00:15:56.806 [2024-07-15 11:57:10.214525] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.806 
11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.806 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:57.125 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:57.125 "name": "raid_bdev1", 00:15:57.125 "uuid": "18d0f51e-9a4d-43e8-900e-cdcb337fbf3c", 00:15:57.125 "strip_size_kb": 64, 00:15:57.125 "state": "online", 00:15:57.125 "raid_level": "raid0", 00:15:57.125 "superblock": true, 00:15:57.125 "num_base_bdevs": 3, 00:15:57.125 "num_base_bdevs_discovered": 3, 00:15:57.125 "num_base_bdevs_operational": 3, 00:15:57.125 "base_bdevs_list": [ 00:15:57.125 { 00:15:57.125 "name": "BaseBdev1", 00:15:57.125 "uuid": "196d56ea-75d9-5e4b-acb0-dcb52bf44c2b", 00:15:57.125 "is_configured": true, 00:15:57.125 "data_offset": 2048, 00:15:57.125 "data_size": 63488 00:15:57.125 }, 00:15:57.125 { 00:15:57.125 "name": "BaseBdev2", 00:15:57.125 "uuid": "0c001fd4-0074-513a-85d1-e63e69ca721b", 00:15:57.125 "is_configured": true, 00:15:57.125 "data_offset": 2048, 00:15:57.125 "data_size": 63488 00:15:57.125 }, 00:15:57.125 { 00:15:57.125 "name": "BaseBdev3", 00:15:57.125 "uuid": "f8ae2f2e-9612-546d-a87b-32076e34239e", 00:15:57.125 "is_configured": true, 00:15:57.125 "data_offset": 2048, 00:15:57.125 "data_size": 63488 00:15:57.125 } 00:15:57.125 ] 00:15:57.125 }' 00:15:57.125 11:57:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:57.125 11:57:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.692 11:57:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:15:57.692 11:57:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:57.692 [2024-07-15 11:57:11.207282] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2116110 00:15:58.628 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local 
tmp 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.886 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:59.146 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.146 "name": "raid_bdev1", 00:15:59.146 "uuid": "18d0f51e-9a4d-43e8-900e-cdcb337fbf3c", 00:15:59.146 "strip_size_kb": 64, 00:15:59.146 "state": "online", 00:15:59.146 "raid_level": "raid0", 00:15:59.146 "superblock": true, 00:15:59.146 "num_base_bdevs": 3, 00:15:59.146 "num_base_bdevs_discovered": 3, 00:15:59.146 "num_base_bdevs_operational": 3, 00:15:59.146 "base_bdevs_list": [ 00:15:59.146 { 00:15:59.146 "name": "BaseBdev1", 00:15:59.146 "uuid": "196d56ea-75d9-5e4b-acb0-dcb52bf44c2b", 00:15:59.146 "is_configured": true, 00:15:59.146 "data_offset": 2048, 00:15:59.146 "data_size": 63488 00:15:59.146 }, 00:15:59.146 { 00:15:59.146 "name": "BaseBdev2", 00:15:59.146 "uuid": "0c001fd4-0074-513a-85d1-e63e69ca721b", 00:15:59.146 "is_configured": true, 00:15:59.146 "data_offset": 2048, 00:15:59.146 "data_size": 63488 00:15:59.146 }, 00:15:59.146 { 00:15:59.146 "name": "BaseBdev3", 00:15:59.146 "uuid": "f8ae2f2e-9612-546d-a87b-32076e34239e", 00:15:59.146 "is_configured": true, 00:15:59.146 "data_offset": 2048, 00:15:59.146 "data_size": 63488 00:15:59.146 } 00:15:59.146 ] 00:15:59.146 }' 00:15:59.146 11:57:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.146 11:57:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.712 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:59.971 [2024-07-15 11:57:13.501331] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:59.971 [2024-07-15 11:57:13.501371] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:59.971 [2024-07-15 11:57:13.504544] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:59.971 [2024-07-15 11:57:13.504583] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:59.971 [2024-07-15 11:57:13.504622] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:59.971 [2024-07-15 11:57:13.504633] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21114b0 name raid_bdev1, state offline 00:15:59.971 0 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1487714 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1487714 ']' 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1487714 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:15:59.971 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1487714 00:16:00.230 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:16:00.230 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:00.230 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1487714' 00:16:00.230 killing process with pid 1487714 00:16:00.230 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1487714 00:16:00.230 [2024-07-15 11:57:13.574655] bdev_raid.c:1358:raid_bdev_fini_start: 
*DEBUG*: raid_bdev_fini_start 00:16:00.230 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1487714 00:16:00.230 [2024-07-15 11:57:13.598711] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.d8zCUjkc9e 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:00.489 11:57:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:16:00.489 00:16:00.490 real 0m7.087s 00:16:00.490 user 0m11.235s 00:16:00.490 sys 0m1.260s 00:16:00.490 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:00.490 11:57:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.490 ************************************ 00:16:00.490 END TEST raid_read_error_test 00:16:00.490 ************************************ 00:16:00.490 11:57:13 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:00.490 11:57:13 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:16:00.490 11:57:13 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:00.490 11:57:13 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:00.490 11:57:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:00.490 ************************************ 
00:16:00.490 START TEST raid_write_error_test 00:16:00.490 ************************************ 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 3 write 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local 
raid_bdev_name=raid_bdev1 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NjoFt0pWoH 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1488699 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1488699 /var/tmp/spdk-raid.sock 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1488699 ']' 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:16:00.490 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:00.490 11:57:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.490 [2024-07-15 11:57:14.014707] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:16:00.490 [2024-07-15 11:57:14.014779] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1488699 ] 00:16:00.749 [2024-07-15 11:57:14.145050] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:00.749 [2024-07-15 11:57:14.246592] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:00.749 [2024-07-15 11:57:14.311503] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:00.749 [2024-07-15 11:57:14.311541] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:01.686 11:57:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:01.686 11:57:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:16:01.686 11:57:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:01.686 11:57:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:01.686 BaseBdev1_malloc 00:16:01.686 11:57:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:01.946 true 00:16:01.946 11:57:15 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:01.946 [2024-07-15 11:57:15.541039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:01.946 [2024-07-15 11:57:15.541089] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:01.946 [2024-07-15 11:57:15.541107] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23a04e0 00:16:01.946 [2024-07-15 11:57:15.541119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:02.205 [2024-07-15 11:57:15.542734] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:02.205 [2024-07-15 11:57:15.542762] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:02.205 BaseBdev1 00:16:02.205 11:57:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:02.205 11:57:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:02.205 BaseBdev2_malloc 00:16:02.205 11:57:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:02.463 true 00:16:02.463 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:02.722 [2024-07-15 11:57:16.167339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:02.722 [2024-07-15 11:57:16.167385] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:02.722 [2024-07-15 11:57:16.167404] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23a57b0 00:16:02.722 [2024-07-15 11:57:16.167416] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:02.722 [2024-07-15 11:57:16.168917] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:02.722 [2024-07-15 11:57:16.168944] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:02.722 BaseBdev2 00:16:02.722 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:02.722 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:02.980 BaseBdev3_malloc 00:16:02.980 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:02.980 true 00:16:02.980 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:03.238 [2024-07-15 11:57:16.705417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:03.238 [2024-07-15 11:57:16.705458] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:03.238 [2024-07-15 11:57:16.705479] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23a78f0 00:16:03.238 [2024-07-15 11:57:16.705491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:03.238 [2024-07-15 11:57:16.706961] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:16:03.238 [2024-07-15 11:57:16.706989] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:03.238 BaseBdev3 00:16:03.238 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:03.498 [2024-07-15 11:57:16.873891] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:03.498 [2024-07-15 11:57:16.875067] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:03.498 [2024-07-15 11:57:16.875134] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.498 [2024-07-15 11:57:16.875328] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x23a94b0 00:16:03.498 [2024-07-15 11:57:16.875340] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:03.498 [2024-07-15 11:57:16.875523] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ab5e0 00:16:03.498 [2024-07-15 11:57:16.875664] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x23a94b0 00:16:03.498 [2024-07-15 11:57:16.875674] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x23a94b0 00:16:03.498 [2024-07-15 11:57:16.875780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid0 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:03.498 11:57:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:03.758 11:57:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.758 "name": "raid_bdev1", 00:16:03.758 "uuid": "4812d6a5-9c08-4fab-986f-228d584e605b", 00:16:03.758 "strip_size_kb": 64, 00:16:03.758 "state": "online", 00:16:03.758 "raid_level": "raid0", 00:16:03.758 "superblock": true, 00:16:03.758 "num_base_bdevs": 3, 00:16:03.758 "num_base_bdevs_discovered": 3, 00:16:03.758 "num_base_bdevs_operational": 3, 00:16:03.758 "base_bdevs_list": [ 00:16:03.758 { 00:16:03.758 "name": "BaseBdev1", 00:16:03.758 "uuid": "ce3e464b-25bb-5916-8c78-0a4b6ce64814", 00:16:03.758 "is_configured": true, 00:16:03.758 "data_offset": 2048, 00:16:03.758 "data_size": 63488 00:16:03.758 }, 00:16:03.758 { 00:16:03.758 "name": "BaseBdev2", 00:16:03.758 "uuid": "42071b6d-a829-56e1-9358-0c3375c1d0e3", 00:16:03.758 "is_configured": true, 00:16:03.758 "data_offset": 2048, 00:16:03.758 "data_size": 63488 00:16:03.758 }, 00:16:03.758 { 00:16:03.758 "name": "BaseBdev3", 
00:16:03.758 "uuid": "3b6bc933-85b5-5682-a598-52da0fba644b", 00:16:03.758 "is_configured": true, 00:16:03.758 "data_offset": 2048, 00:16:03.758 "data_size": 63488 00:16:03.758 } 00:16:03.758 ] 00:16:03.758 }' 00:16:03.758 11:57:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.758 11:57:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:04.363 11:57:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:04.363 11:57:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:04.363 [2024-07-15 11:57:17.872819] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23ae110 00:16:05.301 11:57:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:05.560 11:57:19 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:05.560 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:05.820 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.820 "name": "raid_bdev1", 00:16:05.820 "uuid": "4812d6a5-9c08-4fab-986f-228d584e605b", 00:16:05.820 "strip_size_kb": 64, 00:16:05.820 "state": "online", 00:16:05.820 "raid_level": "raid0", 00:16:05.820 "superblock": true, 00:16:05.820 "num_base_bdevs": 3, 00:16:05.820 "num_base_bdevs_discovered": 3, 00:16:05.820 "num_base_bdevs_operational": 3, 00:16:05.820 "base_bdevs_list": [ 00:16:05.820 { 00:16:05.820 "name": "BaseBdev1", 00:16:05.820 "uuid": "ce3e464b-25bb-5916-8c78-0a4b6ce64814", 00:16:05.820 "is_configured": true, 00:16:05.820 "data_offset": 2048, 00:16:05.820 "data_size": 63488 00:16:05.820 }, 00:16:05.820 { 00:16:05.820 "name": "BaseBdev2", 00:16:05.820 "uuid": "42071b6d-a829-56e1-9358-0c3375c1d0e3", 00:16:05.820 "is_configured": true, 00:16:05.820 "data_offset": 2048, 00:16:05.820 "data_size": 63488 00:16:05.820 }, 00:16:05.820 { 00:16:05.820 "name": "BaseBdev3", 00:16:05.820 "uuid": "3b6bc933-85b5-5682-a598-52da0fba644b", 00:16:05.820 "is_configured": true, 00:16:05.820 "data_offset": 2048, 00:16:05.820 "data_size": 
63488 00:16:05.820 } 00:16:05.820 ] 00:16:05.820 }' 00:16:05.820 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.820 11:57:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.388 11:57:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:06.648 [2024-07-15 11:57:20.159281] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:06.648 [2024-07-15 11:57:20.159321] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:06.648 [2024-07-15 11:57:20.162497] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:06.648 [2024-07-15 11:57:20.162537] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:06.648 [2024-07-15 11:57:20.162570] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:06.648 [2024-07-15 11:57:20.162581] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x23a94b0 name raid_bdev1, state offline 00:16:06.648 0 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1488699 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1488699 ']' 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1488699 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1488699 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1488699' 00:16:06.648 killing process with pid 1488699 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1488699 00:16:06.648 [2024-07-15 11:57:20.231325] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:06.648 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1488699 00:16:06.907 [2024-07-15 11:57:20.255337] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NjoFt0pWoH 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:16:06.907 00:16:06.907 real 0m6.569s 00:16:06.907 user 0m10.308s 00:16:06.907 sys 0m1.152s 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:16:06.907 11:57:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.907 ************************************ 00:16:06.907 END TEST raid_write_error_test 00:16:06.907 
************************************ 00:16:07.166 11:57:20 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:16:07.166 11:57:20 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:07.166 11:57:20 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:16:07.166 11:57:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:16:07.166 11:57:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:16:07.166 11:57:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:07.166 ************************************ 00:16:07.166 START TEST raid_state_function_test 00:16:07.166 ************************************ 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 false 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1489672 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1489672' 00:16:07.166 Process raid pid: 1489672 00:16:07.166 11:57:20 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1489672 /var/tmp/spdk-raid.sock 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1489672 ']' 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:07.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:07.166 11:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.166 [2024-07-15 11:57:20.704163] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:16:07.166 [2024-07-15 11:57:20.704298] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:07.425 [2024-07-15 11:57:20.900224] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:07.425 [2024-07-15 11:57:20.997339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:07.683 [2024-07-15 11:57:21.057809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:07.683 [2024-07-15 11:57:21.057841] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:08.251 [2024-07-15 11:57:21.814002] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:08.251 [2024-07-15 11:57:21.814046] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:08.251 [2024-07-15 11:57:21.814056] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:08.251 [2024-07-15 11:57:21.814068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:08.251 [2024-07-15 11:57:21.814076] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:08.251 [2024-07-15 11:57:21.814087] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:08.251 
11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.251 11:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.509 11:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.509 "name": "Existed_Raid", 00:16:08.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.509 "strip_size_kb": 64, 00:16:08.509 "state": "configuring", 00:16:08.509 "raid_level": "concat", 00:16:08.509 "superblock": false, 00:16:08.509 "num_base_bdevs": 3, 00:16:08.509 "num_base_bdevs_discovered": 0, 00:16:08.509 "num_base_bdevs_operational": 3, 00:16:08.509 "base_bdevs_list": [ 00:16:08.509 { 
00:16:08.509 "name": "BaseBdev1", 00:16:08.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.509 "is_configured": false, 00:16:08.509 "data_offset": 0, 00:16:08.509 "data_size": 0 00:16:08.509 }, 00:16:08.509 { 00:16:08.509 "name": "BaseBdev2", 00:16:08.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.509 "is_configured": false, 00:16:08.509 "data_offset": 0, 00:16:08.509 "data_size": 0 00:16:08.509 }, 00:16:08.509 { 00:16:08.509 "name": "BaseBdev3", 00:16:08.509 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.509 "is_configured": false, 00:16:08.509 "data_offset": 0, 00:16:08.509 "data_size": 0 00:16:08.509 } 00:16:08.509 ] 00:16:08.509 }' 00:16:08.509 11:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.509 11:57:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.137 11:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:09.396 [2024-07-15 11:57:22.956877] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:09.396 [2024-07-15 11:57:22.956909] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc79b00 name Existed_Raid, state configuring 00:16:09.396 11:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:09.655 [2024-07-15 11:57:23.201536] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:09.655 [2024-07-15 11:57:23.201568] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:09.655 [2024-07-15 11:57:23.201578] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:16:09.655 [2024-07-15 11:57:23.201589] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:09.655 [2024-07-15 11:57:23.201597] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:09.655 [2024-07-15 11:57:23.201609] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:09.655 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:09.960 [2024-07-15 11:57:23.457192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:09.960 BaseBdev1 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:09.960 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:10.218 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:10.477 [ 00:16:10.477 { 00:16:10.477 "name": "BaseBdev1", 00:16:10.477 "aliases": [ 00:16:10.477 
"8c9823b7-d73a-4bdf-8598-053966ebc2e7" 00:16:10.477 ], 00:16:10.477 "product_name": "Malloc disk", 00:16:10.477 "block_size": 512, 00:16:10.477 "num_blocks": 65536, 00:16:10.477 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:10.477 "assigned_rate_limits": { 00:16:10.477 "rw_ios_per_sec": 0, 00:16:10.477 "rw_mbytes_per_sec": 0, 00:16:10.477 "r_mbytes_per_sec": 0, 00:16:10.477 "w_mbytes_per_sec": 0 00:16:10.477 }, 00:16:10.477 "claimed": true, 00:16:10.477 "claim_type": "exclusive_write", 00:16:10.477 "zoned": false, 00:16:10.477 "supported_io_types": { 00:16:10.477 "read": true, 00:16:10.477 "write": true, 00:16:10.477 "unmap": true, 00:16:10.477 "flush": true, 00:16:10.477 "reset": true, 00:16:10.477 "nvme_admin": false, 00:16:10.477 "nvme_io": false, 00:16:10.477 "nvme_io_md": false, 00:16:10.477 "write_zeroes": true, 00:16:10.477 "zcopy": true, 00:16:10.477 "get_zone_info": false, 00:16:10.477 "zone_management": false, 00:16:10.477 "zone_append": false, 00:16:10.477 "compare": false, 00:16:10.477 "compare_and_write": false, 00:16:10.477 "abort": true, 00:16:10.477 "seek_hole": false, 00:16:10.477 "seek_data": false, 00:16:10.477 "copy": true, 00:16:10.477 "nvme_iov_md": false 00:16:10.477 }, 00:16:10.477 "memory_domains": [ 00:16:10.477 { 00:16:10.477 "dma_device_id": "system", 00:16:10.477 "dma_device_type": 1 00:16:10.477 }, 00:16:10.477 { 00:16:10.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:10.477 "dma_device_type": 2 00:16:10.477 } 00:16:10.477 ], 00:16:10.477 "driver_specific": {} 00:16:10.477 } 00:16:10.477 ] 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.477 11:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:10.735 11:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:10.735 "name": "Existed_Raid", 00:16:10.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.735 "strip_size_kb": 64, 00:16:10.735 "state": "configuring", 00:16:10.735 "raid_level": "concat", 00:16:10.735 "superblock": false, 00:16:10.735 "num_base_bdevs": 3, 00:16:10.735 "num_base_bdevs_discovered": 1, 00:16:10.735 "num_base_bdevs_operational": 3, 00:16:10.735 "base_bdevs_list": [ 00:16:10.735 { 00:16:10.735 "name": "BaseBdev1", 00:16:10.735 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:10.735 "is_configured": true, 00:16:10.735 "data_offset": 0, 00:16:10.735 "data_size": 65536 00:16:10.735 }, 00:16:10.735 { 00:16:10.735 "name": "BaseBdev2", 00:16:10.735 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:16:10.735 "is_configured": false, 00:16:10.735 "data_offset": 0, 00:16:10.735 "data_size": 0 00:16:10.735 }, 00:16:10.735 { 00:16:10.735 "name": "BaseBdev3", 00:16:10.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:10.735 "is_configured": false, 00:16:10.735 "data_offset": 0, 00:16:10.735 "data_size": 0 00:16:10.735 } 00:16:10.735 ] 00:16:10.735 }' 00:16:10.735 11:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:10.735 11:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.670 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:11.929 [2024-07-15 11:57:25.322135] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:11.929 [2024-07-15 11:57:25.322172] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc79390 name Existed_Raid, state configuring 00:16:11.929 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:12.189 [2024-07-15 11:57:25.570845] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:12.189 [2024-07-15 11:57:25.572266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:12.189 [2024-07-15 11:57:25.572300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:12.189 [2024-07-15 11:57:25.572310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:12.189 [2024-07-15 11:57:25.572321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.189 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.448 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.448 "name": "Existed_Raid", 00:16:12.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.448 "strip_size_kb": 64, 00:16:12.448 "state": "configuring", 00:16:12.448 
"raid_level": "concat", 00:16:12.448 "superblock": false, 00:16:12.448 "num_base_bdevs": 3, 00:16:12.448 "num_base_bdevs_discovered": 1, 00:16:12.448 "num_base_bdevs_operational": 3, 00:16:12.448 "base_bdevs_list": [ 00:16:12.448 { 00:16:12.449 "name": "BaseBdev1", 00:16:12.449 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:12.449 "is_configured": true, 00:16:12.449 "data_offset": 0, 00:16:12.449 "data_size": 65536 00:16:12.449 }, 00:16:12.449 { 00:16:12.449 "name": "BaseBdev2", 00:16:12.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.449 "is_configured": false, 00:16:12.449 "data_offset": 0, 00:16:12.449 "data_size": 0 00:16:12.449 }, 00:16:12.449 { 00:16:12.449 "name": "BaseBdev3", 00:16:12.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.449 "is_configured": false, 00:16:12.449 "data_offset": 0, 00:16:12.449 "data_size": 0 00:16:12.449 } 00:16:12.449 ] 00:16:12.449 }' 00:16:12.449 11:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.449 11:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:13.387 11:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:13.646 [2024-07-15 11:57:26.990014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:13.646 BaseBdev2 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:13.646 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:13.905 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:13.906 [ 00:16:13.906 { 00:16:13.906 "name": "BaseBdev2", 00:16:13.906 "aliases": [ 00:16:13.906 "3a2e5675-29ee-44ab-8d4f-46d6444d0a63" 00:16:13.906 ], 00:16:13.906 "product_name": "Malloc disk", 00:16:13.906 "block_size": 512, 00:16:13.906 "num_blocks": 65536, 00:16:13.906 "uuid": "3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:13.906 "assigned_rate_limits": { 00:16:13.906 "rw_ios_per_sec": 0, 00:16:13.906 "rw_mbytes_per_sec": 0, 00:16:13.906 "r_mbytes_per_sec": 0, 00:16:13.906 "w_mbytes_per_sec": 0 00:16:13.906 }, 00:16:13.906 "claimed": true, 00:16:13.906 "claim_type": "exclusive_write", 00:16:13.906 "zoned": false, 00:16:13.906 "supported_io_types": { 00:16:13.906 "read": true, 00:16:13.906 "write": true, 00:16:13.906 "unmap": true, 00:16:13.906 "flush": true, 00:16:13.906 "reset": true, 00:16:13.906 "nvme_admin": false, 00:16:13.906 "nvme_io": false, 00:16:13.906 "nvme_io_md": false, 00:16:13.906 "write_zeroes": true, 00:16:13.906 "zcopy": true, 00:16:13.906 "get_zone_info": false, 00:16:13.906 "zone_management": false, 00:16:13.906 "zone_append": false, 00:16:13.906 "compare": false, 00:16:13.906 "compare_and_write": false, 00:16:13.906 "abort": true, 00:16:13.906 "seek_hole": false, 00:16:13.906 "seek_data": false, 00:16:13.906 "copy": true, 00:16:13.906 "nvme_iov_md": false 00:16:13.906 }, 00:16:13.906 "memory_domains": [ 00:16:13.906 { 00:16:13.906 "dma_device_id": "system", 
00:16:13.906 "dma_device_type": 1 00:16:13.906 }, 00:16:13.906 { 00:16:13.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:13.906 "dma_device_type": 2 00:16:13.906 } 00:16:13.906 ], 00:16:13.906 "driver_specific": {} 00:16:13.906 } 00:16:13.906 ] 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.165 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.165 "name": "Existed_Raid", 00:16:14.165 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.165 "strip_size_kb": 64, 00:16:14.165 "state": "configuring", 00:16:14.165 "raid_level": "concat", 00:16:14.166 "superblock": false, 00:16:14.166 "num_base_bdevs": 3, 00:16:14.166 "num_base_bdevs_discovered": 2, 00:16:14.166 "num_base_bdevs_operational": 3, 00:16:14.166 "base_bdevs_list": [ 00:16:14.166 { 00:16:14.166 "name": "BaseBdev1", 00:16:14.166 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:14.166 "is_configured": true, 00:16:14.166 "data_offset": 0, 00:16:14.166 "data_size": 65536 00:16:14.166 }, 00:16:14.166 { 00:16:14.166 "name": "BaseBdev2", 00:16:14.166 "uuid": "3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:14.166 "is_configured": true, 00:16:14.166 "data_offset": 0, 00:16:14.166 "data_size": 65536 00:16:14.166 }, 00:16:14.166 { 00:16:14.166 "name": "BaseBdev3", 00:16:14.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.166 "is_configured": false, 00:16:14.166 "data_offset": 0, 00:16:14.166 "data_size": 0 00:16:14.166 } 00:16:14.166 ] 00:16:14.166 }' 00:16:14.166 11:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.425 11:57:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.991 11:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:15.250 [2024-07-15 11:57:28.601719] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:15.250 [2024-07-15 11:57:28.601751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc7a480 00:16:15.250 
[2024-07-15 11:57:28.601760] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:15.250 [2024-07-15 11:57:28.601977] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9805d0 00:16:15.250 [2024-07-15 11:57:28.602095] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc7a480 00:16:15.250 [2024-07-15 11:57:28.602105] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc7a480 00:16:15.250 [2024-07-15 11:57:28.602264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:15.250 BaseBdev3 00:16:15.250 11:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:15.250 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:15.250 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:15.251 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:15.251 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:15.251 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:15.251 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.510 11:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:15.510 [ 00:16:15.510 { 00:16:15.510 "name": "BaseBdev3", 00:16:15.510 "aliases": [ 00:16:15.510 "abdf2104-8002-47de-88c1-80e4f3c60266" 00:16:15.510 ], 00:16:15.510 "product_name": "Malloc disk", 00:16:15.510 "block_size": 512, 00:16:15.510 "num_blocks": 
65536, 00:16:15.510 "uuid": "abdf2104-8002-47de-88c1-80e4f3c60266", 00:16:15.510 "assigned_rate_limits": { 00:16:15.510 "rw_ios_per_sec": 0, 00:16:15.510 "rw_mbytes_per_sec": 0, 00:16:15.510 "r_mbytes_per_sec": 0, 00:16:15.510 "w_mbytes_per_sec": 0 00:16:15.510 }, 00:16:15.510 "claimed": true, 00:16:15.510 "claim_type": "exclusive_write", 00:16:15.510 "zoned": false, 00:16:15.510 "supported_io_types": { 00:16:15.510 "read": true, 00:16:15.510 "write": true, 00:16:15.510 "unmap": true, 00:16:15.510 "flush": true, 00:16:15.510 "reset": true, 00:16:15.510 "nvme_admin": false, 00:16:15.510 "nvme_io": false, 00:16:15.510 "nvme_io_md": false, 00:16:15.510 "write_zeroes": true, 00:16:15.510 "zcopy": true, 00:16:15.510 "get_zone_info": false, 00:16:15.510 "zone_management": false, 00:16:15.510 "zone_append": false, 00:16:15.510 "compare": false, 00:16:15.510 "compare_and_write": false, 00:16:15.510 "abort": true, 00:16:15.510 "seek_hole": false, 00:16:15.510 "seek_data": false, 00:16:15.510 "copy": true, 00:16:15.510 "nvme_iov_md": false 00:16:15.510 }, 00:16:15.510 "memory_domains": [ 00:16:15.510 { 00:16:15.510 "dma_device_id": "system", 00:16:15.510 "dma_device_type": 1 00:16:15.510 }, 00:16:15.510 { 00:16:15.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.510 "dma_device_type": 2 00:16:15.510 } 00:16:15.510 ], 00:16:15.510 "driver_specific": {} 00:16:15.510 } 00:16:15.510 ] 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.770 11:57:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.770 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.032 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.032 "name": "Existed_Raid", 00:16:16.032 "uuid": "25704efc-3b9d-49af-8e73-d017d1e4408f", 00:16:16.032 "strip_size_kb": 64, 00:16:16.032 "state": "online", 00:16:16.032 "raid_level": "concat", 00:16:16.032 "superblock": false, 00:16:16.032 "num_base_bdevs": 3, 00:16:16.032 "num_base_bdevs_discovered": 3, 00:16:16.032 "num_base_bdevs_operational": 3, 00:16:16.032 "base_bdevs_list": [ 00:16:16.032 { 00:16:16.032 "name": "BaseBdev1", 00:16:16.032 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:16.032 "is_configured": true, 00:16:16.032 "data_offset": 0, 00:16:16.032 "data_size": 65536 00:16:16.032 }, 00:16:16.032 { 00:16:16.032 "name": "BaseBdev2", 00:16:16.032 "uuid": 
"3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:16.032 "is_configured": true, 00:16:16.032 "data_offset": 0, 00:16:16.032 "data_size": 65536 00:16:16.032 }, 00:16:16.032 { 00:16:16.032 "name": "BaseBdev3", 00:16:16.032 "uuid": "abdf2104-8002-47de-88c1-80e4f3c60266", 00:16:16.032 "is_configured": true, 00:16:16.032 "data_offset": 0, 00:16:16.032 "data_size": 65536 00:16:16.032 } 00:16:16.032 ] 00:16:16.032 }' 00:16:16.032 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.032 11:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:16.600 11:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:16.859 [2024-07-15 11:57:30.206239] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:16.859 "name": "Existed_Raid", 00:16:16.859 "aliases": [ 00:16:16.859 "25704efc-3b9d-49af-8e73-d017d1e4408f" 00:16:16.859 ], 00:16:16.859 "product_name": "Raid Volume", 
00:16:16.859 "block_size": 512, 00:16:16.859 "num_blocks": 196608, 00:16:16.859 "uuid": "25704efc-3b9d-49af-8e73-d017d1e4408f", 00:16:16.859 "assigned_rate_limits": { 00:16:16.859 "rw_ios_per_sec": 0, 00:16:16.859 "rw_mbytes_per_sec": 0, 00:16:16.859 "r_mbytes_per_sec": 0, 00:16:16.859 "w_mbytes_per_sec": 0 00:16:16.859 }, 00:16:16.859 "claimed": false, 00:16:16.859 "zoned": false, 00:16:16.859 "supported_io_types": { 00:16:16.859 "read": true, 00:16:16.859 "write": true, 00:16:16.859 "unmap": true, 00:16:16.859 "flush": true, 00:16:16.859 "reset": true, 00:16:16.859 "nvme_admin": false, 00:16:16.859 "nvme_io": false, 00:16:16.859 "nvme_io_md": false, 00:16:16.859 "write_zeroes": true, 00:16:16.859 "zcopy": false, 00:16:16.859 "get_zone_info": false, 00:16:16.859 "zone_management": false, 00:16:16.859 "zone_append": false, 00:16:16.859 "compare": false, 00:16:16.859 "compare_and_write": false, 00:16:16.859 "abort": false, 00:16:16.859 "seek_hole": false, 00:16:16.859 "seek_data": false, 00:16:16.859 "copy": false, 00:16:16.859 "nvme_iov_md": false 00:16:16.859 }, 00:16:16.859 "memory_domains": [ 00:16:16.859 { 00:16:16.859 "dma_device_id": "system", 00:16:16.859 "dma_device_type": 1 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.859 "dma_device_type": 2 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "dma_device_id": "system", 00:16:16.859 "dma_device_type": 1 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.859 "dma_device_type": 2 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "dma_device_id": "system", 00:16:16.859 "dma_device_type": 1 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:16.859 "dma_device_type": 2 00:16:16.859 } 00:16:16.859 ], 00:16:16.859 "driver_specific": { 00:16:16.859 "raid": { 00:16:16.859 "uuid": "25704efc-3b9d-49af-8e73-d017d1e4408f", 00:16:16.859 "strip_size_kb": 64, 00:16:16.859 "state": "online", 00:16:16.859 
"raid_level": "concat", 00:16:16.859 "superblock": false, 00:16:16.859 "num_base_bdevs": 3, 00:16:16.859 "num_base_bdevs_discovered": 3, 00:16:16.859 "num_base_bdevs_operational": 3, 00:16:16.859 "base_bdevs_list": [ 00:16:16.859 { 00:16:16.859 "name": "BaseBdev1", 00:16:16.859 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:16.859 "is_configured": true, 00:16:16.859 "data_offset": 0, 00:16:16.859 "data_size": 65536 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "name": "BaseBdev2", 00:16:16.859 "uuid": "3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:16.859 "is_configured": true, 00:16:16.859 "data_offset": 0, 00:16:16.859 "data_size": 65536 00:16:16.859 }, 00:16:16.859 { 00:16:16.859 "name": "BaseBdev3", 00:16:16.859 "uuid": "abdf2104-8002-47de-88c1-80e4f3c60266", 00:16:16.859 "is_configured": true, 00:16:16.859 "data_offset": 0, 00:16:16.859 "data_size": 65536 00:16:16.859 } 00:16:16.859 ] 00:16:16.859 } 00:16:16.859 } 00:16:16.859 }' 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:16.859 BaseBdev2 00:16:16.859 BaseBdev3' 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:16.859 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.119 "name": "BaseBdev1", 00:16:17.119 "aliases": [ 00:16:17.119 "8c9823b7-d73a-4bdf-8598-053966ebc2e7" 00:16:17.119 ], 00:16:17.119 "product_name": "Malloc disk", 00:16:17.119 
"block_size": 512, 00:16:17.119 "num_blocks": 65536, 00:16:17.119 "uuid": "8c9823b7-d73a-4bdf-8598-053966ebc2e7", 00:16:17.119 "assigned_rate_limits": { 00:16:17.119 "rw_ios_per_sec": 0, 00:16:17.119 "rw_mbytes_per_sec": 0, 00:16:17.119 "r_mbytes_per_sec": 0, 00:16:17.119 "w_mbytes_per_sec": 0 00:16:17.119 }, 00:16:17.119 "claimed": true, 00:16:17.119 "claim_type": "exclusive_write", 00:16:17.119 "zoned": false, 00:16:17.119 "supported_io_types": { 00:16:17.119 "read": true, 00:16:17.119 "write": true, 00:16:17.119 "unmap": true, 00:16:17.119 "flush": true, 00:16:17.119 "reset": true, 00:16:17.119 "nvme_admin": false, 00:16:17.119 "nvme_io": false, 00:16:17.119 "nvme_io_md": false, 00:16:17.119 "write_zeroes": true, 00:16:17.119 "zcopy": true, 00:16:17.119 "get_zone_info": false, 00:16:17.119 "zone_management": false, 00:16:17.119 "zone_append": false, 00:16:17.119 "compare": false, 00:16:17.119 "compare_and_write": false, 00:16:17.119 "abort": true, 00:16:17.119 "seek_hole": false, 00:16:17.119 "seek_data": false, 00:16:17.119 "copy": true, 00:16:17.119 "nvme_iov_md": false 00:16:17.119 }, 00:16:17.119 "memory_domains": [ 00:16:17.119 { 00:16:17.119 "dma_device_id": "system", 00:16:17.119 "dma_device_type": 1 00:16:17.119 }, 00:16:17.119 { 00:16:17.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.119 "dma_device_type": 2 00:16:17.119 } 00:16:17.119 ], 00:16:17.119 "driver_specific": {} 00:16:17.119 }' 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.119 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:17.379 11:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:17.638 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:17.638 "name": "BaseBdev2", 00:16:17.638 "aliases": [ 00:16:17.638 "3a2e5675-29ee-44ab-8d4f-46d6444d0a63" 00:16:17.638 ], 00:16:17.638 "product_name": "Malloc disk", 00:16:17.638 "block_size": 512, 00:16:17.638 "num_blocks": 65536, 00:16:17.638 "uuid": "3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:17.638 "assigned_rate_limits": { 00:16:17.638 "rw_ios_per_sec": 0, 00:16:17.638 "rw_mbytes_per_sec": 0, 00:16:17.638 "r_mbytes_per_sec": 0, 00:16:17.638 "w_mbytes_per_sec": 0 00:16:17.638 }, 00:16:17.638 "claimed": true, 00:16:17.638 "claim_type": "exclusive_write", 00:16:17.638 "zoned": false, 00:16:17.638 "supported_io_types": { 00:16:17.638 "read": true, 00:16:17.638 "write": true, 00:16:17.638 "unmap": true, 00:16:17.638 "flush": true, 00:16:17.638 "reset": true, 00:16:17.638 "nvme_admin": 
false, 00:16:17.638 "nvme_io": false, 00:16:17.638 "nvme_io_md": false, 00:16:17.638 "write_zeroes": true, 00:16:17.638 "zcopy": true, 00:16:17.638 "get_zone_info": false, 00:16:17.638 "zone_management": false, 00:16:17.638 "zone_append": false, 00:16:17.638 "compare": false, 00:16:17.638 "compare_and_write": false, 00:16:17.638 "abort": true, 00:16:17.638 "seek_hole": false, 00:16:17.638 "seek_data": false, 00:16:17.638 "copy": true, 00:16:17.638 "nvme_iov_md": false 00:16:17.638 }, 00:16:17.638 "memory_domains": [ 00:16:17.638 { 00:16:17.638 "dma_device_id": "system", 00:16:17.638 "dma_device_type": 1 00:16:17.638 }, 00:16:17.638 { 00:16:17.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:17.638 "dma_device_type": 2 00:16:17.638 } 00:16:17.638 ], 00:16:17.638 "driver_specific": {} 00:16:17.638 }' 00:16:17.638 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.638 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:17.638 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:17.639 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:17.898 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:18.157 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:18.157 "name": "BaseBdev3", 00:16:18.157 "aliases": [ 00:16:18.157 "abdf2104-8002-47de-88c1-80e4f3c60266" 00:16:18.157 ], 00:16:18.157 "product_name": "Malloc disk", 00:16:18.157 "block_size": 512, 00:16:18.157 "num_blocks": 65536, 00:16:18.157 "uuid": "abdf2104-8002-47de-88c1-80e4f3c60266", 00:16:18.157 "assigned_rate_limits": { 00:16:18.157 "rw_ios_per_sec": 0, 00:16:18.157 "rw_mbytes_per_sec": 0, 00:16:18.158 "r_mbytes_per_sec": 0, 00:16:18.158 "w_mbytes_per_sec": 0 00:16:18.158 }, 00:16:18.158 "claimed": true, 00:16:18.158 "claim_type": "exclusive_write", 00:16:18.158 "zoned": false, 00:16:18.158 "supported_io_types": { 00:16:18.158 "read": true, 00:16:18.158 "write": true, 00:16:18.158 "unmap": true, 00:16:18.158 "flush": true, 00:16:18.158 "reset": true, 00:16:18.158 "nvme_admin": false, 00:16:18.158 "nvme_io": false, 00:16:18.158 "nvme_io_md": false, 00:16:18.158 "write_zeroes": true, 00:16:18.158 "zcopy": true, 00:16:18.158 "get_zone_info": false, 00:16:18.158 "zone_management": false, 00:16:18.158 "zone_append": false, 00:16:18.158 "compare": false, 00:16:18.158 "compare_and_write": false, 00:16:18.158 "abort": true, 00:16:18.158 "seek_hole": false, 00:16:18.158 "seek_data": false, 00:16:18.158 "copy": true, 00:16:18.158 "nvme_iov_md": false 00:16:18.158 }, 00:16:18.158 "memory_domains": [ 00:16:18.158 { 00:16:18.158 "dma_device_id": "system", 00:16:18.158 "dma_device_type": 1 00:16:18.158 
}, 00:16:18.158 { 00:16:18.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:18.158 "dma_device_type": 2 00:16:18.158 } 00:16:18.158 ], 00:16:18.158 "driver_specific": {} 00:16:18.158 }' 00:16:18.158 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:18.417 11:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.676 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:18.676 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:18.676 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:18.935 [2024-07-15 11:57:32.303591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:18.935 [2024-07-15 11:57:32.303615] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:18.935 [2024-07-15 11:57:32.303655] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:18.935 
11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.935 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:19.194 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.194 "name": "Existed_Raid", 00:16:19.194 "uuid": "25704efc-3b9d-49af-8e73-d017d1e4408f", 00:16:19.194 "strip_size_kb": 64, 00:16:19.194 "state": "offline", 00:16:19.194 "raid_level": "concat", 00:16:19.194 "superblock": false, 00:16:19.194 "num_base_bdevs": 3, 00:16:19.194 "num_base_bdevs_discovered": 2, 00:16:19.194 "num_base_bdevs_operational": 2, 00:16:19.194 "base_bdevs_list": [ 00:16:19.194 { 00:16:19.194 "name": null, 00:16:19.194 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.194 "is_configured": false, 00:16:19.194 "data_offset": 0, 00:16:19.194 "data_size": 65536 00:16:19.194 }, 00:16:19.194 { 00:16:19.194 "name": "BaseBdev2", 00:16:19.194 "uuid": "3a2e5675-29ee-44ab-8d4f-46d6444d0a63", 00:16:19.194 "is_configured": true, 00:16:19.194 "data_offset": 0, 00:16:19.194 "data_size": 65536 00:16:19.194 }, 00:16:19.194 { 00:16:19.194 "name": "BaseBdev3", 00:16:19.194 "uuid": "abdf2104-8002-47de-88c1-80e4f3c60266", 00:16:19.194 "is_configured": true, 00:16:19.194 "data_offset": 0, 00:16:19.194 "data_size": 65536 00:16:19.194 } 00:16:19.194 ] 00:16:19.194 }' 00:16:19.194 11:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.194 11:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.762 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:19.762 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:19.762 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.762 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:20.021 11:57:33 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:20.021 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:20.021 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:20.280 [2024-07-15 11:57:33.664193] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:20.280 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:20.280 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:20.280 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.280 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:20.539 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:20.539 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:20.539 11:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:20.798 [2024-07-15 11:57:34.165804] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:20.798 [2024-07-15 11:57:34.165846] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc7a480 name Existed_Raid, state offline 00:16:20.798 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:20.798 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:20.798 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.798 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:21.057 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:21.316 BaseBdev2 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:21.316 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:21.574 11:57:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:21.574 [ 00:16:21.574 { 00:16:21.574 "name": "BaseBdev2", 00:16:21.574 "aliases": [ 00:16:21.574 "66a626b3-8b37-4591-b5f4-f9db89e8e0e2" 00:16:21.574 ], 00:16:21.574 "product_name": "Malloc disk", 00:16:21.574 "block_size": 512, 00:16:21.574 "num_blocks": 65536, 00:16:21.574 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:21.574 "assigned_rate_limits": { 00:16:21.574 "rw_ios_per_sec": 0, 00:16:21.574 "rw_mbytes_per_sec": 0, 00:16:21.574 "r_mbytes_per_sec": 0, 00:16:21.574 "w_mbytes_per_sec": 0 00:16:21.574 }, 00:16:21.574 "claimed": false, 00:16:21.574 "zoned": false, 00:16:21.574 "supported_io_types": { 00:16:21.574 "read": true, 00:16:21.574 "write": true, 00:16:21.574 "unmap": true, 00:16:21.574 "flush": true, 00:16:21.574 "reset": true, 00:16:21.574 "nvme_admin": false, 00:16:21.574 "nvme_io": false, 00:16:21.574 "nvme_io_md": false, 00:16:21.574 "write_zeroes": true, 00:16:21.574 "zcopy": true, 00:16:21.574 "get_zone_info": false, 00:16:21.574 "zone_management": false, 00:16:21.574 "zone_append": false, 00:16:21.574 "compare": false, 00:16:21.574 "compare_and_write": false, 00:16:21.575 "abort": true, 00:16:21.575 "seek_hole": false, 00:16:21.575 "seek_data": false, 00:16:21.575 "copy": true, 00:16:21.575 "nvme_iov_md": false 00:16:21.575 }, 00:16:21.575 "memory_domains": [ 00:16:21.575 { 00:16:21.575 "dma_device_id": "system", 00:16:21.575 "dma_device_type": 1 00:16:21.575 }, 00:16:21.575 { 00:16:21.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:21.575 "dma_device_type": 2 00:16:21.575 } 00:16:21.575 ], 00:16:21.575 "driver_specific": {} 00:16:21.575 } 00:16:21.575 ] 00:16:21.575 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:21.575 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:21.575 11:57:35 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:21.575 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:21.833 BaseBdev3 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:21.833 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:22.092 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:22.352 [ 00:16:22.352 { 00:16:22.352 "name": "BaseBdev3", 00:16:22.352 "aliases": [ 00:16:22.352 "861f7d2a-2c6b-4c7f-9a53-931cec7bb435" 00:16:22.352 ], 00:16:22.352 "product_name": "Malloc disk", 00:16:22.352 "block_size": 512, 00:16:22.352 "num_blocks": 65536, 00:16:22.352 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:22.352 "assigned_rate_limits": { 00:16:22.352 "rw_ios_per_sec": 0, 00:16:22.352 "rw_mbytes_per_sec": 0, 00:16:22.352 "r_mbytes_per_sec": 0, 00:16:22.352 "w_mbytes_per_sec": 0 00:16:22.352 }, 00:16:22.352 "claimed": false, 00:16:22.352 "zoned": false, 00:16:22.352 
"supported_io_types": { 00:16:22.352 "read": true, 00:16:22.352 "write": true, 00:16:22.352 "unmap": true, 00:16:22.352 "flush": true, 00:16:22.352 "reset": true, 00:16:22.352 "nvme_admin": false, 00:16:22.352 "nvme_io": false, 00:16:22.352 "nvme_io_md": false, 00:16:22.352 "write_zeroes": true, 00:16:22.352 "zcopy": true, 00:16:22.352 "get_zone_info": false, 00:16:22.352 "zone_management": false, 00:16:22.352 "zone_append": false, 00:16:22.352 "compare": false, 00:16:22.352 "compare_and_write": false, 00:16:22.352 "abort": true, 00:16:22.352 "seek_hole": false, 00:16:22.352 "seek_data": false, 00:16:22.352 "copy": true, 00:16:22.352 "nvme_iov_md": false 00:16:22.352 }, 00:16:22.352 "memory_domains": [ 00:16:22.352 { 00:16:22.352 "dma_device_id": "system", 00:16:22.352 "dma_device_type": 1 00:16:22.352 }, 00:16:22.352 { 00:16:22.352 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:22.352 "dma_device_type": 2 00:16:22.352 } 00:16:22.352 ], 00:16:22.352 "driver_specific": {} 00:16:22.352 } 00:16:22.352 ] 00:16:22.352 11:57:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:22.352 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:22.352 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:22.352 11:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:22.612 [2024-07-15 11:57:36.103400] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:22.612 [2024-07-15 11:57:36.103439] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:22.612 [2024-07-15 11:57:36.103458] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:22.612 
[2024-07-15 11:57:36.104944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:22.612 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.871 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.871 "name": "Existed_Raid", 00:16:22.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.871 "strip_size_kb": 64, 00:16:22.871 "state": "configuring", 00:16:22.871 "raid_level": "concat", 00:16:22.871 "superblock": false, 00:16:22.871 "num_base_bdevs": 3, 00:16:22.871 
"num_base_bdevs_discovered": 2, 00:16:22.871 "num_base_bdevs_operational": 3, 00:16:22.871 "base_bdevs_list": [ 00:16:22.871 { 00:16:22.871 "name": "BaseBdev1", 00:16:22.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.871 "is_configured": false, 00:16:22.871 "data_offset": 0, 00:16:22.871 "data_size": 0 00:16:22.871 }, 00:16:22.871 { 00:16:22.871 "name": "BaseBdev2", 00:16:22.872 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:22.872 "is_configured": true, 00:16:22.872 "data_offset": 0, 00:16:22.872 "data_size": 65536 00:16:22.872 }, 00:16:22.872 { 00:16:22.872 "name": "BaseBdev3", 00:16:22.872 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:22.872 "is_configured": true, 00:16:22.872 "data_offset": 0, 00:16:22.872 "data_size": 65536 00:16:22.872 } 00:16:22.872 ] 00:16:22.872 }' 00:16:22.872 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.872 11:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.440 11:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:23.699 [2024-07-15 11:57:37.186349] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:23.699 11:57:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.699 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.267 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:24.267 "name": "Existed_Raid", 00:16:24.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.267 "strip_size_kb": 64, 00:16:24.267 "state": "configuring", 00:16:24.267 "raid_level": "concat", 00:16:24.267 "superblock": false, 00:16:24.267 "num_base_bdevs": 3, 00:16:24.267 "num_base_bdevs_discovered": 1, 00:16:24.267 "num_base_bdevs_operational": 3, 00:16:24.267 "base_bdevs_list": [ 00:16:24.267 { 00:16:24.267 "name": "BaseBdev1", 00:16:24.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:24.267 "is_configured": false, 00:16:24.267 "data_offset": 0, 00:16:24.267 "data_size": 0 00:16:24.267 }, 00:16:24.267 { 00:16:24.267 "name": null, 00:16:24.267 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:24.267 "is_configured": false, 00:16:24.267 "data_offset": 0, 00:16:24.267 "data_size": 65536 00:16:24.267 }, 00:16:24.267 { 00:16:24.267 "name": "BaseBdev3", 00:16:24.267 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:24.267 "is_configured": true, 00:16:24.267 "data_offset": 0, 
00:16:24.267 "data_size": 65536 00:16:24.267 } 00:16:24.267 ] 00:16:24.267 }' 00:16:24.267 11:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:24.267 11:57:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:24.835 11:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.835 11:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:25.094 11:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:25.094 11:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:25.352 [2024-07-15 11:57:38.807263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:25.352 BaseBdev1 00:16:25.352 11:57:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:25.352 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:25.352 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:25.353 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:16:25.353 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:25.353 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:25.353 11:57:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:25.611 11:57:39 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:25.871 [ 00:16:25.871 { 00:16:25.871 "name": "BaseBdev1", 00:16:25.871 "aliases": [ 00:16:25.871 "fef94366-51f2-4ed4-9425-b6e24dd73348" 00:16:25.871 ], 00:16:25.871 "product_name": "Malloc disk", 00:16:25.871 "block_size": 512, 00:16:25.871 "num_blocks": 65536, 00:16:25.871 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:25.871 "assigned_rate_limits": { 00:16:25.871 "rw_ios_per_sec": 0, 00:16:25.871 "rw_mbytes_per_sec": 0, 00:16:25.871 "r_mbytes_per_sec": 0, 00:16:25.871 "w_mbytes_per_sec": 0 00:16:25.871 }, 00:16:25.871 "claimed": true, 00:16:25.871 "claim_type": "exclusive_write", 00:16:25.871 "zoned": false, 00:16:25.871 "supported_io_types": { 00:16:25.871 "read": true, 00:16:25.871 "write": true, 00:16:25.871 "unmap": true, 00:16:25.871 "flush": true, 00:16:25.871 "reset": true, 00:16:25.871 "nvme_admin": false, 00:16:25.871 "nvme_io": false, 00:16:25.871 "nvme_io_md": false, 00:16:25.871 "write_zeroes": true, 00:16:25.871 "zcopy": true, 00:16:25.871 "get_zone_info": false, 00:16:25.871 "zone_management": false, 00:16:25.871 "zone_append": false, 00:16:25.871 "compare": false, 00:16:25.871 "compare_and_write": false, 00:16:25.871 "abort": true, 00:16:25.871 "seek_hole": false, 00:16:25.871 "seek_data": false, 00:16:25.871 "copy": true, 00:16:25.871 "nvme_iov_md": false 00:16:25.871 }, 00:16:25.871 "memory_domains": [ 00:16:25.871 { 00:16:25.871 "dma_device_id": "system", 00:16:25.871 "dma_device_type": 1 00:16:25.871 }, 00:16:25.871 { 00:16:25.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.871 "dma_device_type": 2 00:16:25.871 } 00:16:25.871 ], 00:16:25.871 "driver_specific": {} 00:16:25.871 } 00:16:25.871 ] 00:16:25.871 11:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:25.871 11:57:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:25.871 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:25.872 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.439 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.439 "name": "Existed_Raid", 00:16:26.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.439 "strip_size_kb": 64, 00:16:26.439 "state": "configuring", 00:16:26.439 "raid_level": "concat", 00:16:26.439 "superblock": false, 00:16:26.439 "num_base_bdevs": 3, 00:16:26.439 "num_base_bdevs_discovered": 2, 00:16:26.439 "num_base_bdevs_operational": 3, 00:16:26.439 "base_bdevs_list": [ 00:16:26.439 { 
00:16:26.439 "name": "BaseBdev1", 00:16:26.439 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:26.439 "is_configured": true, 00:16:26.439 "data_offset": 0, 00:16:26.439 "data_size": 65536 00:16:26.439 }, 00:16:26.439 { 00:16:26.439 "name": null, 00:16:26.439 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:26.439 "is_configured": false, 00:16:26.439 "data_offset": 0, 00:16:26.439 "data_size": 65536 00:16:26.439 }, 00:16:26.439 { 00:16:26.439 "name": "BaseBdev3", 00:16:26.439 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:26.439 "is_configured": true, 00:16:26.439 "data_offset": 0, 00:16:26.439 "data_size": 65536 00:16:26.439 } 00:16:26.439 ] 00:16:26.439 }' 00:16:26.439 11:57:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.439 11:57:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.376 11:57:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.376 11:57:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:27.634 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:27.634 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:27.893 [2024-07-15 11:57:41.285870] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.893 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:28.152 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:28.152 "name": "Existed_Raid", 00:16:28.152 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:28.152 "strip_size_kb": 64, 00:16:28.152 "state": "configuring", 00:16:28.152 "raid_level": "concat", 00:16:28.152 "superblock": false, 00:16:28.152 "num_base_bdevs": 3, 00:16:28.152 "num_base_bdevs_discovered": 1, 00:16:28.152 "num_base_bdevs_operational": 3, 00:16:28.152 "base_bdevs_list": [ 00:16:28.152 { 00:16:28.152 "name": "BaseBdev1", 00:16:28.152 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:28.152 "is_configured": true, 00:16:28.152 "data_offset": 0, 00:16:28.152 "data_size": 65536 00:16:28.152 }, 00:16:28.152 { 00:16:28.152 "name": null, 00:16:28.152 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:28.152 
"is_configured": false, 00:16:28.152 "data_offset": 0, 00:16:28.152 "data_size": 65536 00:16:28.152 }, 00:16:28.152 { 00:16:28.152 "name": null, 00:16:28.152 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:28.152 "is_configured": false, 00:16:28.152 "data_offset": 0, 00:16:28.152 "data_size": 65536 00:16:28.152 } 00:16:28.152 ] 00:16:28.152 }' 00:16:28.152 11:57:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:28.152 11:57:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.095 11:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.095 11:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:29.095 11:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:29.095 11:57:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:29.661 [2024-07-15 11:57:43.162862] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.661 11:57:43 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.661 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.919 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.919 "name": "Existed_Raid", 00:16:29.919 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.919 "strip_size_kb": 64, 00:16:29.919 "state": "configuring", 00:16:29.919 "raid_level": "concat", 00:16:29.919 "superblock": false, 00:16:29.919 "num_base_bdevs": 3, 00:16:29.920 "num_base_bdevs_discovered": 2, 00:16:29.920 "num_base_bdevs_operational": 3, 00:16:29.920 "base_bdevs_list": [ 00:16:29.920 { 00:16:29.920 "name": "BaseBdev1", 00:16:29.920 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:29.920 "is_configured": true, 00:16:29.920 "data_offset": 0, 00:16:29.920 "data_size": 65536 00:16:29.920 }, 00:16:29.920 { 00:16:29.920 "name": null, 00:16:29.920 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:29.920 "is_configured": false, 00:16:29.920 "data_offset": 0, 00:16:29.920 "data_size": 65536 00:16:29.920 }, 00:16:29.920 { 00:16:29.920 "name": "BaseBdev3", 00:16:29.920 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:29.920 "is_configured": true, 00:16:29.920 "data_offset": 0, 
00:16:29.920 "data_size": 65536 00:16:29.920 } 00:16:29.920 ] 00:16:29.920 }' 00:16:29.920 11:57:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.920 11:57:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:30.855 11:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.855 11:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:31.113 11:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:31.113 11:57:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:31.680 [2024-07-15 11:57:45.035859] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.680 
11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.680 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.939 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.939 "name": "Existed_Raid", 00:16:31.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.939 "strip_size_kb": 64, 00:16:31.939 "state": "configuring", 00:16:31.939 "raid_level": "concat", 00:16:31.939 "superblock": false, 00:16:31.939 "num_base_bdevs": 3, 00:16:31.939 "num_base_bdevs_discovered": 1, 00:16:31.939 "num_base_bdevs_operational": 3, 00:16:31.939 "base_bdevs_list": [ 00:16:31.939 { 00:16:31.939 "name": null, 00:16:31.939 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:31.939 "is_configured": false, 00:16:31.939 "data_offset": 0, 00:16:31.939 "data_size": 65536 00:16:31.939 }, 00:16:31.939 { 00:16:31.939 "name": null, 00:16:31.939 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:31.939 "is_configured": false, 00:16:31.939 "data_offset": 0, 00:16:31.939 "data_size": 65536 00:16:31.939 }, 00:16:31.939 { 00:16:31.939 "name": "BaseBdev3", 00:16:31.939 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:31.939 "is_configured": true, 00:16:31.939 "data_offset": 0, 00:16:31.939 "data_size": 65536 00:16:31.939 } 00:16:31.939 ] 00:16:31.939 }' 00:16:31.939 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.939 11:57:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:32.508 11:57:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:32.508 11:57:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:32.767 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:32.767 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:33.026 [2024-07-15 11:57:46.395765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:16:33.026 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.285 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.285 "name": "Existed_Raid", 00:16:33.285 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.285 "strip_size_kb": 64, 00:16:33.285 "state": "configuring", 00:16:33.285 "raid_level": "concat", 00:16:33.285 "superblock": false, 00:16:33.285 "num_base_bdevs": 3, 00:16:33.285 "num_base_bdevs_discovered": 2, 00:16:33.285 "num_base_bdevs_operational": 3, 00:16:33.285 "base_bdevs_list": [ 00:16:33.285 { 00:16:33.285 "name": null, 00:16:33.285 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:33.285 "is_configured": false, 00:16:33.285 "data_offset": 0, 00:16:33.285 "data_size": 65536 00:16:33.285 }, 00:16:33.285 { 00:16:33.285 "name": "BaseBdev2", 00:16:33.285 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:33.285 "is_configured": true, 00:16:33.285 "data_offset": 0, 00:16:33.285 "data_size": 65536 00:16:33.285 }, 00:16:33.285 { 00:16:33.285 "name": "BaseBdev3", 00:16:33.285 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:33.285 "is_configured": true, 00:16:33.285 "data_offset": 0, 00:16:33.285 "data_size": 65536 00:16:33.285 } 00:16:33.285 ] 00:16:33.285 }' 00:16:33.286 11:57:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.286 11:57:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:33.854 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.854 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:33.854 11:57:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:33.854 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.854 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:34.113 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u fef94366-51f2-4ed4-9425-b6e24dd73348 00:16:34.373 [2024-07-15 11:57:47.916477] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:34.373 [2024-07-15 11:57:47.916518] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc7ccb0 00:16:34.373 [2024-07-15 11:57:47.916526] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:16:34.373 [2024-07-15 11:57:47.916729] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb94420 00:16:34.373 [2024-07-15 11:57:47.916847] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc7ccb0 00:16:34.373 [2024-07-15 11:57:47.916857] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc7ccb0 00:16:34.373 [2024-07-15 11:57:47.917019] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.373 NewBaseBdev 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:34.373 11:57:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.632 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:34.892 [ 00:16:34.892 { 00:16:34.892 "name": "NewBaseBdev", 00:16:34.892 "aliases": [ 00:16:34.892 "fef94366-51f2-4ed4-9425-b6e24dd73348" 00:16:34.892 ], 00:16:34.892 "product_name": "Malloc disk", 00:16:34.892 "block_size": 512, 00:16:34.892 "num_blocks": 65536, 00:16:34.892 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:34.892 "assigned_rate_limits": { 00:16:34.892 "rw_ios_per_sec": 0, 00:16:34.892 "rw_mbytes_per_sec": 0, 00:16:34.892 "r_mbytes_per_sec": 0, 00:16:34.892 "w_mbytes_per_sec": 0 00:16:34.892 }, 00:16:34.892 "claimed": true, 00:16:34.892 "claim_type": "exclusive_write", 00:16:34.892 "zoned": false, 00:16:34.892 "supported_io_types": { 00:16:34.892 "read": true, 00:16:34.892 "write": true, 00:16:34.892 "unmap": true, 00:16:34.892 "flush": true, 00:16:34.892 "reset": true, 00:16:34.892 "nvme_admin": false, 00:16:34.892 "nvme_io": false, 00:16:34.892 "nvme_io_md": false, 00:16:34.892 "write_zeroes": true, 00:16:34.892 "zcopy": true, 00:16:34.892 "get_zone_info": false, 00:16:34.892 "zone_management": false, 00:16:34.892 "zone_append": false, 00:16:34.892 "compare": false, 00:16:34.892 "compare_and_write": false, 00:16:34.892 "abort": true, 00:16:34.892 "seek_hole": false, 00:16:34.892 "seek_data": false, 00:16:34.892 "copy": true, 00:16:34.892 "nvme_iov_md": 
false 00:16:34.892 }, 00:16:34.892 "memory_domains": [ 00:16:34.892 { 00:16:34.892 "dma_device_id": "system", 00:16:34.892 "dma_device_type": 1 00:16:34.892 }, 00:16:34.892 { 00:16:34.892 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.892 "dma_device_type": 2 00:16:34.892 } 00:16:34.892 ], 00:16:34.892 "driver_specific": {} 00:16:34.892 } 00:16:34.892 ] 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.892 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:35.152 11:57:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:35.152 "name": "Existed_Raid", 00:16:35.152 "uuid": "424ad51c-b69d-4b1f-8050-223ddfae969b", 00:16:35.152 "strip_size_kb": 64, 00:16:35.152 "state": "online", 00:16:35.152 "raid_level": "concat", 00:16:35.152 "superblock": false, 00:16:35.152 "num_base_bdevs": 3, 00:16:35.152 "num_base_bdevs_discovered": 3, 00:16:35.152 "num_base_bdevs_operational": 3, 00:16:35.152 "base_bdevs_list": [ 00:16:35.152 { 00:16:35.152 "name": "NewBaseBdev", 00:16:35.152 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:35.152 "is_configured": true, 00:16:35.152 "data_offset": 0, 00:16:35.152 "data_size": 65536 00:16:35.152 }, 00:16:35.152 { 00:16:35.152 "name": "BaseBdev2", 00:16:35.152 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:35.152 "is_configured": true, 00:16:35.152 "data_offset": 0, 00:16:35.152 "data_size": 65536 00:16:35.152 }, 00:16:35.152 { 00:16:35.152 "name": "BaseBdev3", 00:16:35.152 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:35.152 "is_configured": true, 00:16:35.152 "data_offset": 0, 00:16:35.152 "data_size": 65536 00:16:35.152 } 00:16:35.152 ] 00:16:35.152 }' 00:16:35.152 11:57:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:35.152 11:57:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.722 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:35.722 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:35.722 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:35.722 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:35.722 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:35.722 11:57:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:35.723 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:35.723 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:35.982 [2024-07-15 11:57:49.440813] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:35.982 "name": "Existed_Raid", 00:16:35.982 "aliases": [ 00:16:35.982 "424ad51c-b69d-4b1f-8050-223ddfae969b" 00:16:35.982 ], 00:16:35.982 "product_name": "Raid Volume", 00:16:35.982 "block_size": 512, 00:16:35.982 "num_blocks": 196608, 00:16:35.982 "uuid": "424ad51c-b69d-4b1f-8050-223ddfae969b", 00:16:35.982 "assigned_rate_limits": { 00:16:35.982 "rw_ios_per_sec": 0, 00:16:35.982 "rw_mbytes_per_sec": 0, 00:16:35.982 "r_mbytes_per_sec": 0, 00:16:35.982 "w_mbytes_per_sec": 0 00:16:35.982 }, 00:16:35.982 "claimed": false, 00:16:35.982 "zoned": false, 00:16:35.982 "supported_io_types": { 00:16:35.982 "read": true, 00:16:35.982 "write": true, 00:16:35.982 "unmap": true, 00:16:35.982 "flush": true, 00:16:35.982 "reset": true, 00:16:35.982 "nvme_admin": false, 00:16:35.982 "nvme_io": false, 00:16:35.982 "nvme_io_md": false, 00:16:35.982 "write_zeroes": true, 00:16:35.982 "zcopy": false, 00:16:35.982 "get_zone_info": false, 00:16:35.982 "zone_management": false, 00:16:35.982 "zone_append": false, 00:16:35.982 "compare": false, 00:16:35.982 "compare_and_write": false, 00:16:35.982 "abort": false, 00:16:35.982 "seek_hole": false, 00:16:35.982 "seek_data": false, 00:16:35.982 "copy": false, 00:16:35.982 "nvme_iov_md": false 00:16:35.982 }, 00:16:35.982 "memory_domains": [ 00:16:35.982 { 00:16:35.982 "dma_device_id": "system", 00:16:35.982 "dma_device_type": 1 00:16:35.982 }, 
00:16:35.982 { 00:16:35.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.982 "dma_device_type": 2 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "dma_device_id": "system", 00:16:35.982 "dma_device_type": 1 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.982 "dma_device_type": 2 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "dma_device_id": "system", 00:16:35.982 "dma_device_type": 1 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.982 "dma_device_type": 2 00:16:35.982 } 00:16:35.982 ], 00:16:35.982 "driver_specific": { 00:16:35.982 "raid": { 00:16:35.982 "uuid": "424ad51c-b69d-4b1f-8050-223ddfae969b", 00:16:35.982 "strip_size_kb": 64, 00:16:35.982 "state": "online", 00:16:35.982 "raid_level": "concat", 00:16:35.982 "superblock": false, 00:16:35.982 "num_base_bdevs": 3, 00:16:35.982 "num_base_bdevs_discovered": 3, 00:16:35.982 "num_base_bdevs_operational": 3, 00:16:35.982 "base_bdevs_list": [ 00:16:35.982 { 00:16:35.982 "name": "NewBaseBdev", 00:16:35.982 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:35.982 "is_configured": true, 00:16:35.982 "data_offset": 0, 00:16:35.982 "data_size": 65536 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "name": "BaseBdev2", 00:16:35.982 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:35.982 "is_configured": true, 00:16:35.982 "data_offset": 0, 00:16:35.982 "data_size": 65536 00:16:35.982 }, 00:16:35.982 { 00:16:35.982 "name": "BaseBdev3", 00:16:35.982 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:35.982 "is_configured": true, 00:16:35.982 "data_offset": 0, 00:16:35.982 "data_size": 65536 00:16:35.982 } 00:16:35.982 ] 00:16:35.982 } 00:16:35.982 } 00:16:35.982 }' 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:16:35.982 BaseBdev2 00:16:35.982 BaseBdev3' 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:35.982 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.242 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.242 "name": "NewBaseBdev", 00:16:36.242 "aliases": [ 00:16:36.242 "fef94366-51f2-4ed4-9425-b6e24dd73348" 00:16:36.242 ], 00:16:36.242 "product_name": "Malloc disk", 00:16:36.242 "block_size": 512, 00:16:36.242 "num_blocks": 65536, 00:16:36.242 "uuid": "fef94366-51f2-4ed4-9425-b6e24dd73348", 00:16:36.242 "assigned_rate_limits": { 00:16:36.242 "rw_ios_per_sec": 0, 00:16:36.242 "rw_mbytes_per_sec": 0, 00:16:36.242 "r_mbytes_per_sec": 0, 00:16:36.242 "w_mbytes_per_sec": 0 00:16:36.242 }, 00:16:36.242 "claimed": true, 00:16:36.242 "claim_type": "exclusive_write", 00:16:36.242 "zoned": false, 00:16:36.242 "supported_io_types": { 00:16:36.242 "read": true, 00:16:36.242 "write": true, 00:16:36.242 "unmap": true, 00:16:36.242 "flush": true, 00:16:36.242 "reset": true, 00:16:36.242 "nvme_admin": false, 00:16:36.242 "nvme_io": false, 00:16:36.242 "nvme_io_md": false, 00:16:36.242 "write_zeroes": true, 00:16:36.242 "zcopy": true, 00:16:36.242 "get_zone_info": false, 00:16:36.242 "zone_management": false, 00:16:36.242 "zone_append": false, 00:16:36.242 "compare": false, 00:16:36.242 "compare_and_write": false, 00:16:36.242 "abort": true, 00:16:36.242 "seek_hole": false, 00:16:36.242 "seek_data": false, 00:16:36.242 "copy": true, 00:16:36.242 "nvme_iov_md": false 00:16:36.242 }, 00:16:36.242 "memory_domains": [ 00:16:36.242 { 00:16:36.242 "dma_device_id": "system", 00:16:36.242 
"dma_device_type": 1 00:16:36.242 }, 00:16:36.242 { 00:16:36.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.242 "dma_device_type": 2 00:16:36.242 } 00:16:36.242 ], 00:16:36.242 "driver_specific": {} 00:16:36.242 }' 00:16:36.242 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.242 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.502 11:57:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.502 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.502 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.502 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.760 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.760 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.760 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.760 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.018 "name": 
"BaseBdev2", 00:16:37.018 "aliases": [ 00:16:37.018 "66a626b3-8b37-4591-b5f4-f9db89e8e0e2" 00:16:37.018 ], 00:16:37.018 "product_name": "Malloc disk", 00:16:37.018 "block_size": 512, 00:16:37.018 "num_blocks": 65536, 00:16:37.018 "uuid": "66a626b3-8b37-4591-b5f4-f9db89e8e0e2", 00:16:37.018 "assigned_rate_limits": { 00:16:37.018 "rw_ios_per_sec": 0, 00:16:37.018 "rw_mbytes_per_sec": 0, 00:16:37.018 "r_mbytes_per_sec": 0, 00:16:37.018 "w_mbytes_per_sec": 0 00:16:37.018 }, 00:16:37.018 "claimed": true, 00:16:37.018 "claim_type": "exclusive_write", 00:16:37.018 "zoned": false, 00:16:37.018 "supported_io_types": { 00:16:37.018 "read": true, 00:16:37.018 "write": true, 00:16:37.018 "unmap": true, 00:16:37.018 "flush": true, 00:16:37.018 "reset": true, 00:16:37.018 "nvme_admin": false, 00:16:37.018 "nvme_io": false, 00:16:37.018 "nvme_io_md": false, 00:16:37.018 "write_zeroes": true, 00:16:37.018 "zcopy": true, 00:16:37.018 "get_zone_info": false, 00:16:37.018 "zone_management": false, 00:16:37.018 "zone_append": false, 00:16:37.018 "compare": false, 00:16:37.018 "compare_and_write": false, 00:16:37.018 "abort": true, 00:16:37.018 "seek_hole": false, 00:16:37.018 "seek_data": false, 00:16:37.018 "copy": true, 00:16:37.018 "nvme_iov_md": false 00:16:37.018 }, 00:16:37.018 "memory_domains": [ 00:16:37.018 { 00:16:37.018 "dma_device_id": "system", 00:16:37.018 "dma_device_type": 1 00:16:37.018 }, 00:16:37.018 { 00:16:37.018 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.018 "dma_device_type": 2 00:16:37.018 } 00:16:37.018 ], 00:16:37.018 "driver_specific": {} 00:16:37.018 }' 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.018 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:37.277 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:37.536 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:37.536 "name": "BaseBdev3", 00:16:37.536 "aliases": [ 00:16:37.536 "861f7d2a-2c6b-4c7f-9a53-931cec7bb435" 00:16:37.536 ], 00:16:37.536 "product_name": "Malloc disk", 00:16:37.536 "block_size": 512, 00:16:37.536 "num_blocks": 65536, 00:16:37.536 "uuid": "861f7d2a-2c6b-4c7f-9a53-931cec7bb435", 00:16:37.536 "assigned_rate_limits": { 00:16:37.536 "rw_ios_per_sec": 0, 00:16:37.536 "rw_mbytes_per_sec": 0, 00:16:37.536 "r_mbytes_per_sec": 0, 00:16:37.536 "w_mbytes_per_sec": 0 00:16:37.536 }, 00:16:37.536 "claimed": true, 00:16:37.536 "claim_type": "exclusive_write", 00:16:37.536 "zoned": false, 00:16:37.536 "supported_io_types": { 
00:16:37.536 "read": true, 00:16:37.536 "write": true, 00:16:37.536 "unmap": true, 00:16:37.536 "flush": true, 00:16:37.536 "reset": true, 00:16:37.536 "nvme_admin": false, 00:16:37.536 "nvme_io": false, 00:16:37.536 "nvme_io_md": false, 00:16:37.536 "write_zeroes": true, 00:16:37.536 "zcopy": true, 00:16:37.536 "get_zone_info": false, 00:16:37.536 "zone_management": false, 00:16:37.536 "zone_append": false, 00:16:37.536 "compare": false, 00:16:37.536 "compare_and_write": false, 00:16:37.536 "abort": true, 00:16:37.536 "seek_hole": false, 00:16:37.536 "seek_data": false, 00:16:37.536 "copy": true, 00:16:37.536 "nvme_iov_md": false 00:16:37.536 }, 00:16:37.536 "memory_domains": [ 00:16:37.536 { 00:16:37.536 "dma_device_id": "system", 00:16:37.536 "dma_device_type": 1 00:16:37.536 }, 00:16:37.536 { 00:16:37.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.536 "dma_device_type": 2 00:16:37.536 } 00:16:37.536 ], 00:16:37.536 "driver_specific": {} 00:16:37.536 }' 00:16:37.536 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.536 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.536 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.536 11:57:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.536 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type
00:16:37.795 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:16:37.795 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:16:37.795 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:16:38.054 [2024-07-15 11:57:51.401759] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:16:38.054 [2024-07-15 11:57:51.401783] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:16:38.054 [2024-07-15 11:57:51.401836] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:16:38.054 [2024-07-15 11:57:51.401886] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:16:38.054 [2024-07-15 11:57:51.401898] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc7ccb0 name Existed_Raid, state offline
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1489672
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1489672 ']'
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1489672
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1489672
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1489672'
00:16:38.054 killing process with pid 1489672
11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1489672
[2024-07-15 11:57:51.474148] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:16:38.054 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1489672
[2024-07-15 11:57:51.501569] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:16:38.313 11:57:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0
00:16:38.313
00:16:38.313 real 0m31.134s
00:16:38.313 user 0m57.106s
00:16:38.313 sys 0m5.548s
00:16:38.313 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:16:38.313 11:57:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:16:38.313 ************************************
00:16:38.313 END TEST raid_state_function_test
00:16:38.313 ************************************
00:16:38.314 11:57:51 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:16:38.314 11:57:51 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true
00:16:38.314 11:57:51 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:16:38.314 11:57:51 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:16:38.314 11:57:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:16:38.314 ************************************
00:16:38.314 START TEST raid_state_function_test_sb
00:16:38.314 ************************************
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 3 true
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb --
bdev/bdev_raid.sh@220 -- # local raid_level=concat
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']'
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']'
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1494310
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1494310'
00:16:38.314 Process raid pid: 1494310
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1494310 /var/tmp/spdk-raid.sock
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1494310 ']'
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100
00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain
socket /var/tmp/spdk-raid.sock...' 00:16:38.314 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:16:38.314 11:57:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.314 [2024-07-15 11:57:51.888105] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:16:38.314 [2024-07-15 11:57:51.888177] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:38.573 [2024-07-15 11:57:52.021895] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:38.573 [2024-07-15 11:57:52.126302] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:38.832 [2024-07-15 11:57:52.192604] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:38.832 [2024-07-15 11:57:52.192641] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:39.401 11:57:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:16:39.401 11:57:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:16:39.401 11:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:39.661 [2024-07-15 11:57:53.045390] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:39.661 [2024-07-15 11:57:53.045438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:39.661 [2024-07-15 11:57:53.045450] 
bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:39.661 [2024-07-15 11:57:53.045462] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:39.661 [2024-07-15 11:57:53.045470] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:39.661 [2024-07-15 11:57:53.045481] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.661 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"Existed_Raid")' 00:16:39.920 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.920 "name": "Existed_Raid", 00:16:39.920 "uuid": "699712e1-aea4-4767-b59e-a747fa311be3", 00:16:39.920 "strip_size_kb": 64, 00:16:39.920 "state": "configuring", 00:16:39.920 "raid_level": "concat", 00:16:39.920 "superblock": true, 00:16:39.920 "num_base_bdevs": 3, 00:16:39.920 "num_base_bdevs_discovered": 0, 00:16:39.920 "num_base_bdevs_operational": 3, 00:16:39.920 "base_bdevs_list": [ 00:16:39.920 { 00:16:39.920 "name": "BaseBdev1", 00:16:39.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.920 "is_configured": false, 00:16:39.920 "data_offset": 0, 00:16:39.920 "data_size": 0 00:16:39.920 }, 00:16:39.920 { 00:16:39.920 "name": "BaseBdev2", 00:16:39.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.920 "is_configured": false, 00:16:39.920 "data_offset": 0, 00:16:39.920 "data_size": 0 00:16:39.920 }, 00:16:39.920 { 00:16:39.920 "name": "BaseBdev3", 00:16:39.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.920 "is_configured": false, 00:16:39.920 "data_offset": 0, 00:16:39.920 "data_size": 0 00:16:39.920 } 00:16:39.920 ] 00:16:39.920 }' 00:16:39.920 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.920 11:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:40.487 11:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:40.746 [2024-07-15 11:57:54.148144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:40.746 [2024-07-15 11:57:54.148176] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8ab00 name Existed_Raid, state configuring 00:16:40.746 11:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:41.005 [2024-07-15 11:57:54.400841] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:41.005 [2024-07-15 11:57:54.400872] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:41.005 [2024-07-15 11:57:54.400887] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:41.005 [2024-07-15 11:57:54.400899] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:41.005 [2024-07-15 11:57:54.400907] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:41.005 [2024-07-15 11:57:54.400919] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:41.005 11:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:41.264 [2024-07-15 11:57:54.659386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:41.264 BaseBdev1 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:16:41.264 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:41.524 11:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:41.783 [ 00:16:41.783 { 00:16:41.783 "name": "BaseBdev1", 00:16:41.783 "aliases": [ 00:16:41.783 "fde0cd26-ca34-43e4-92ae-b5e350496543" 00:16:41.783 ], 00:16:41.783 "product_name": "Malloc disk", 00:16:41.783 "block_size": 512, 00:16:41.783 "num_blocks": 65536, 00:16:41.783 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:41.783 "assigned_rate_limits": { 00:16:41.783 "rw_ios_per_sec": 0, 00:16:41.783 "rw_mbytes_per_sec": 0, 00:16:41.783 "r_mbytes_per_sec": 0, 00:16:41.783 "w_mbytes_per_sec": 0 00:16:41.783 }, 00:16:41.783 "claimed": true, 00:16:41.783 "claim_type": "exclusive_write", 00:16:41.783 "zoned": false, 00:16:41.783 "supported_io_types": { 00:16:41.783 "read": true, 00:16:41.783 "write": true, 00:16:41.783 "unmap": true, 00:16:41.783 "flush": true, 00:16:41.783 "reset": true, 00:16:41.783 "nvme_admin": false, 00:16:41.783 "nvme_io": false, 00:16:41.783 "nvme_io_md": false, 00:16:41.783 "write_zeroes": true, 00:16:41.783 "zcopy": true, 00:16:41.783 "get_zone_info": false, 00:16:41.783 "zone_management": false, 00:16:41.783 "zone_append": false, 00:16:41.783 "compare": false, 00:16:41.783 "compare_and_write": false, 00:16:41.783 "abort": true, 00:16:41.783 "seek_hole": false, 00:16:41.783 "seek_data": false, 00:16:41.783 "copy": true, 00:16:41.783 "nvme_iov_md": false 00:16:41.783 }, 00:16:41.783 "memory_domains": [ 00:16:41.783 { 00:16:41.783 "dma_device_id": "system", 00:16:41.783 "dma_device_type": 1 00:16:41.783 }, 00:16:41.783 { 00:16:41.783 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:41.783 
"dma_device_type": 2 00:16:41.783 } 00:16:41.783 ], 00:16:41.783 "driver_specific": {} 00:16:41.783 } 00:16:41.783 ] 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.783 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.042 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.042 "name": "Existed_Raid", 00:16:42.042 "uuid": "4f94177d-965d-4867-9de8-7e8372f25977", 00:16:42.042 "strip_size_kb": 64, 
00:16:42.042 "state": "configuring", 00:16:42.042 "raid_level": "concat", 00:16:42.042 "superblock": true, 00:16:42.042 "num_base_bdevs": 3, 00:16:42.042 "num_base_bdevs_discovered": 1, 00:16:42.042 "num_base_bdevs_operational": 3, 00:16:42.042 "base_bdevs_list": [ 00:16:42.042 { 00:16:42.042 "name": "BaseBdev1", 00:16:42.042 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:42.043 "is_configured": true, 00:16:42.043 "data_offset": 2048, 00:16:42.043 "data_size": 63488 00:16:42.043 }, 00:16:42.043 { 00:16:42.043 "name": "BaseBdev2", 00:16:42.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.043 "is_configured": false, 00:16:42.043 "data_offset": 0, 00:16:42.043 "data_size": 0 00:16:42.043 }, 00:16:42.043 { 00:16:42.043 "name": "BaseBdev3", 00:16:42.043 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.043 "is_configured": false, 00:16:42.043 "data_offset": 0, 00:16:42.043 "data_size": 0 00:16:42.043 } 00:16:42.043 ] 00:16:42.043 }' 00:16:42.043 11:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.043 11:57:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.611 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:42.870 [2024-07-15 11:57:56.271664] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:42.870 [2024-07-15 11:57:56.271710] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8a390 name Existed_Raid, state configuring 00:16:42.870 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:43.128 [2024-07-15 11:57:56.520468] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:43.128 [2024-07-15 11:57:56.521902] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:43.128 [2024-07-15 11:57:56.521937] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:43.128 [2024-07-15 11:57:56.521947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:43.128 [2024-07-15 11:57:56.521958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.128 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.386 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.386 "name": "Existed_Raid", 00:16:43.386 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:43.386 "strip_size_kb": 64, 00:16:43.386 "state": "configuring", 00:16:43.386 "raid_level": "concat", 00:16:43.386 "superblock": true, 00:16:43.386 "num_base_bdevs": 3, 00:16:43.386 "num_base_bdevs_discovered": 1, 00:16:43.386 "num_base_bdevs_operational": 3, 00:16:43.386 "base_bdevs_list": [ 00:16:43.386 { 00:16:43.386 "name": "BaseBdev1", 00:16:43.386 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:43.386 "is_configured": true, 00:16:43.386 "data_offset": 2048, 00:16:43.386 "data_size": 63488 00:16:43.386 }, 00:16:43.386 { 00:16:43.386 "name": "BaseBdev2", 00:16:43.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.386 "is_configured": false, 00:16:43.386 "data_offset": 0, 00:16:43.386 "data_size": 0 00:16:43.386 }, 00:16:43.386 { 00:16:43.386 "name": "BaseBdev3", 00:16:43.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:43.386 "is_configured": false, 00:16:43.386 "data_offset": 0, 00:16:43.386 "data_size": 0 00:16:43.386 } 00:16:43.386 ] 00:16:43.386 }' 00:16:43.386 11:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.386 11:57:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.954 11:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:16:44.213 [2024-07-15 11:57:57.566644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:44.213 BaseBdev2 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:44.213 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:44.509 11:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:44.509 [ 00:16:44.509 { 00:16:44.509 "name": "BaseBdev2", 00:16:44.509 "aliases": [ 00:16:44.509 "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994" 00:16:44.509 ], 00:16:44.509 "product_name": "Malloc disk", 00:16:44.509 "block_size": 512, 00:16:44.509 "num_blocks": 65536, 00:16:44.509 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:44.509 "assigned_rate_limits": { 00:16:44.509 "rw_ios_per_sec": 0, 00:16:44.509 "rw_mbytes_per_sec": 0, 00:16:44.509 "r_mbytes_per_sec": 0, 00:16:44.509 "w_mbytes_per_sec": 0 00:16:44.509 }, 00:16:44.509 "claimed": true, 00:16:44.509 "claim_type": "exclusive_write", 00:16:44.509 "zoned": false, 00:16:44.509 "supported_io_types": { 00:16:44.509 "read": true, 00:16:44.509 "write": true, 
00:16:44.509 "unmap": true, 00:16:44.509 "flush": true, 00:16:44.509 "reset": true, 00:16:44.509 "nvme_admin": false, 00:16:44.509 "nvme_io": false, 00:16:44.509 "nvme_io_md": false, 00:16:44.509 "write_zeroes": true, 00:16:44.509 "zcopy": true, 00:16:44.509 "get_zone_info": false, 00:16:44.509 "zone_management": false, 00:16:44.509 "zone_append": false, 00:16:44.509 "compare": false, 00:16:44.509 "compare_and_write": false, 00:16:44.509 "abort": true, 00:16:44.509 "seek_hole": false, 00:16:44.509 "seek_data": false, 00:16:44.509 "copy": true, 00:16:44.509 "nvme_iov_md": false 00:16:44.509 }, 00:16:44.509 "memory_domains": [ 00:16:44.509 { 00:16:44.509 "dma_device_id": "system", 00:16:44.509 "dma_device_type": 1 00:16:44.509 }, 00:16:44.509 { 00:16:44.509 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:44.509 "dma_device_type": 2 00:16:44.509 } 00:16:44.509 ], 00:16:44.509 "driver_specific": {} 00:16:44.509 } 00:16:44.509 ] 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.509 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.825 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.825 "name": "Existed_Raid", 00:16:44.825 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:44.825 "strip_size_kb": 64, 00:16:44.825 "state": "configuring", 00:16:44.825 "raid_level": "concat", 00:16:44.825 "superblock": true, 00:16:44.825 "num_base_bdevs": 3, 00:16:44.825 "num_base_bdevs_discovered": 2, 00:16:44.825 "num_base_bdevs_operational": 3, 00:16:44.825 "base_bdevs_list": [ 00:16:44.825 { 00:16:44.825 "name": "BaseBdev1", 00:16:44.825 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:44.825 "is_configured": true, 00:16:44.825 "data_offset": 2048, 00:16:44.825 "data_size": 63488 00:16:44.825 }, 00:16:44.825 { 00:16:44.825 "name": "BaseBdev2", 00:16:44.825 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:44.825 "is_configured": true, 00:16:44.825 "data_offset": 2048, 00:16:44.825 "data_size": 63488 00:16:44.825 }, 00:16:44.825 { 00:16:44.825 "name": "BaseBdev3", 00:16:44.825 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.825 "is_configured": false, 00:16:44.825 "data_offset": 0, 00:16:44.825 "data_size": 0 00:16:44.825 } 
00:16:44.825 ] 00:16:44.825 }' 00:16:44.825 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.825 11:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.401 11:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:45.660 [2024-07-15 11:57:59.166378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.660 [2024-07-15 11:57:59.166545] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e8b480 00:16:45.660 [2024-07-15 11:57:59.166560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:16:45.660 [2024-07-15 11:57:59.166748] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b915d0 00:16:45.660 [2024-07-15 11:57:59.166867] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e8b480 00:16:45.660 [2024-07-15 11:57:59.166877] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e8b480 00:16:45.660 [2024-07-15 11:57:59.166969] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:45.660 BaseBdev3 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:45.660 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:45.918 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:46.177 [ 00:16:46.177 { 00:16:46.177 "name": "BaseBdev3", 00:16:46.177 "aliases": [ 00:16:46.177 "c4f70158-8e3c-4215-ad25-973c79aeaf30" 00:16:46.177 ], 00:16:46.177 "product_name": "Malloc disk", 00:16:46.177 "block_size": 512, 00:16:46.177 "num_blocks": 65536, 00:16:46.177 "uuid": "c4f70158-8e3c-4215-ad25-973c79aeaf30", 00:16:46.177 "assigned_rate_limits": { 00:16:46.177 "rw_ios_per_sec": 0, 00:16:46.177 "rw_mbytes_per_sec": 0, 00:16:46.177 "r_mbytes_per_sec": 0, 00:16:46.177 "w_mbytes_per_sec": 0 00:16:46.177 }, 00:16:46.177 "claimed": true, 00:16:46.177 "claim_type": "exclusive_write", 00:16:46.177 "zoned": false, 00:16:46.177 "supported_io_types": { 00:16:46.177 "read": true, 00:16:46.177 "write": true, 00:16:46.177 "unmap": true, 00:16:46.177 "flush": true, 00:16:46.177 "reset": true, 00:16:46.177 "nvme_admin": false, 00:16:46.177 "nvme_io": false, 00:16:46.177 "nvme_io_md": false, 00:16:46.177 "write_zeroes": true, 00:16:46.177 "zcopy": true, 00:16:46.177 "get_zone_info": false, 00:16:46.177 "zone_management": false, 00:16:46.177 "zone_append": false, 00:16:46.177 "compare": false, 00:16:46.177 "compare_and_write": false, 00:16:46.177 "abort": true, 00:16:46.177 "seek_hole": false, 00:16:46.177 "seek_data": false, 00:16:46.177 "copy": true, 00:16:46.177 "nvme_iov_md": false 00:16:46.177 }, 00:16:46.177 "memory_domains": [ 00:16:46.177 { 00:16:46.177 "dma_device_id": "system", 00:16:46.177 "dma_device_type": 1 00:16:46.177 }, 00:16:46.177 { 00:16:46.177 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:46.177 "dma_device_type": 2 00:16:46.177 } 00:16:46.177 ], 00:16:46.177 "driver_specific": {} 00:16:46.177 } 00:16:46.177 ] 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.177 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:16:46.436 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.436 "name": "Existed_Raid", 00:16:46.436 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:46.436 "strip_size_kb": 64, 00:16:46.436 "state": "online", 00:16:46.436 "raid_level": "concat", 00:16:46.436 "superblock": true, 00:16:46.436 "num_base_bdevs": 3, 00:16:46.436 "num_base_bdevs_discovered": 3, 00:16:46.436 "num_base_bdevs_operational": 3, 00:16:46.436 "base_bdevs_list": [ 00:16:46.436 { 00:16:46.436 "name": "BaseBdev1", 00:16:46.436 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:46.436 "is_configured": true, 00:16:46.436 "data_offset": 2048, 00:16:46.436 "data_size": 63488 00:16:46.436 }, 00:16:46.436 { 00:16:46.436 "name": "BaseBdev2", 00:16:46.436 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:46.436 "is_configured": true, 00:16:46.436 "data_offset": 2048, 00:16:46.436 "data_size": 63488 00:16:46.436 }, 00:16:46.436 { 00:16:46.436 "name": "BaseBdev3", 00:16:46.436 "uuid": "c4f70158-8e3c-4215-ad25-973c79aeaf30", 00:16:46.436 "is_configured": true, 00:16:46.436 "data_offset": 2048, 00:16:46.436 "data_size": 63488 00:16:46.436 } 00:16:46.436 ] 00:16:46.436 }' 00:16:46.436 11:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.436 11:57:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:47.003 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:47.003 [2024-07-15 11:58:00.578442] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:47.261 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:47.261 "name": "Existed_Raid", 00:16:47.261 "aliases": [ 00:16:47.261 "d409b648-d32f-43e6-9c86-7aa9467cb46d" 00:16:47.261 ], 00:16:47.261 "product_name": "Raid Volume", 00:16:47.261 "block_size": 512, 00:16:47.261 "num_blocks": 190464, 00:16:47.261 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:47.261 "assigned_rate_limits": { 00:16:47.261 "rw_ios_per_sec": 0, 00:16:47.261 "rw_mbytes_per_sec": 0, 00:16:47.261 "r_mbytes_per_sec": 0, 00:16:47.261 "w_mbytes_per_sec": 0 00:16:47.261 }, 00:16:47.261 "claimed": false, 00:16:47.261 "zoned": false, 00:16:47.261 "supported_io_types": { 00:16:47.261 "read": true, 00:16:47.261 "write": true, 00:16:47.261 "unmap": true, 00:16:47.261 "flush": true, 00:16:47.261 "reset": true, 00:16:47.261 "nvme_admin": false, 00:16:47.261 "nvme_io": false, 00:16:47.261 "nvme_io_md": false, 00:16:47.261 "write_zeroes": true, 00:16:47.261 "zcopy": false, 00:16:47.261 "get_zone_info": false, 00:16:47.261 "zone_management": false, 00:16:47.261 "zone_append": false, 00:16:47.261 "compare": false, 00:16:47.261 "compare_and_write": false, 00:16:47.261 "abort": false, 00:16:47.262 "seek_hole": false, 00:16:47.262 "seek_data": false, 00:16:47.262 "copy": false, 00:16:47.262 "nvme_iov_md": false 00:16:47.262 }, 00:16:47.262 "memory_domains": [ 00:16:47.262 { 00:16:47.262 "dma_device_id": "system", 
00:16:47.262 "dma_device_type": 1 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.262 "dma_device_type": 2 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "dma_device_id": "system", 00:16:47.262 "dma_device_type": 1 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.262 "dma_device_type": 2 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "dma_device_id": "system", 00:16:47.262 "dma_device_type": 1 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.262 "dma_device_type": 2 00:16:47.262 } 00:16:47.262 ], 00:16:47.262 "driver_specific": { 00:16:47.262 "raid": { 00:16:47.262 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:47.262 "strip_size_kb": 64, 00:16:47.262 "state": "online", 00:16:47.262 "raid_level": "concat", 00:16:47.262 "superblock": true, 00:16:47.262 "num_base_bdevs": 3, 00:16:47.262 "num_base_bdevs_discovered": 3, 00:16:47.262 "num_base_bdevs_operational": 3, 00:16:47.262 "base_bdevs_list": [ 00:16:47.262 { 00:16:47.262 "name": "BaseBdev1", 00:16:47.262 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:47.262 "is_configured": true, 00:16:47.262 "data_offset": 2048, 00:16:47.262 "data_size": 63488 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "name": "BaseBdev2", 00:16:47.262 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:47.262 "is_configured": true, 00:16:47.262 "data_offset": 2048, 00:16:47.262 "data_size": 63488 00:16:47.262 }, 00:16:47.262 { 00:16:47.262 "name": "BaseBdev3", 00:16:47.262 "uuid": "c4f70158-8e3c-4215-ad25-973c79aeaf30", 00:16:47.262 "is_configured": true, 00:16:47.262 "data_offset": 2048, 00:16:47.262 "data_size": 63488 00:16:47.262 } 00:16:47.262 ] 00:16:47.262 } 00:16:47.262 } 00:16:47.262 }' 00:16:47.262 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:47.262 11:58:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:47.262 BaseBdev2 00:16:47.262 BaseBdev3' 00:16:47.262 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:47.262 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:47.262 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:47.521 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:47.521 "name": "BaseBdev1", 00:16:47.521 "aliases": [ 00:16:47.521 "fde0cd26-ca34-43e4-92ae-b5e350496543" 00:16:47.521 ], 00:16:47.521 "product_name": "Malloc disk", 00:16:47.521 "block_size": 512, 00:16:47.521 "num_blocks": 65536, 00:16:47.521 "uuid": "fde0cd26-ca34-43e4-92ae-b5e350496543", 00:16:47.521 "assigned_rate_limits": { 00:16:47.521 "rw_ios_per_sec": 0, 00:16:47.521 "rw_mbytes_per_sec": 0, 00:16:47.521 "r_mbytes_per_sec": 0, 00:16:47.521 "w_mbytes_per_sec": 0 00:16:47.521 }, 00:16:47.521 "claimed": true, 00:16:47.521 "claim_type": "exclusive_write", 00:16:47.521 "zoned": false, 00:16:47.521 "supported_io_types": { 00:16:47.521 "read": true, 00:16:47.521 "write": true, 00:16:47.521 "unmap": true, 00:16:47.521 "flush": true, 00:16:47.521 "reset": true, 00:16:47.521 "nvme_admin": false, 00:16:47.521 "nvme_io": false, 00:16:47.521 "nvme_io_md": false, 00:16:47.521 "write_zeroes": true, 00:16:47.521 "zcopy": true, 00:16:47.521 "get_zone_info": false, 00:16:47.521 "zone_management": false, 00:16:47.521 "zone_append": false, 00:16:47.521 "compare": false, 00:16:47.521 "compare_and_write": false, 00:16:47.521 "abort": true, 00:16:47.521 "seek_hole": false, 00:16:47.521 "seek_data": false, 00:16:47.521 "copy": true, 00:16:47.521 "nvme_iov_md": false 00:16:47.521 }, 00:16:47.521 "memory_domains": 
[ 00:16:47.521 { 00:16:47.521 "dma_device_id": "system", 00:16:47.521 "dma_device_type": 1 00:16:47.521 }, 00:16:47.521 { 00:16:47.521 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:47.521 "dma_device_type": 2 00:16:47.521 } 00:16:47.521 ], 00:16:47.521 "driver_specific": {} 00:16:47.521 }' 00:16:47.521 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:47.521 11:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:47.521 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:47.521 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:47.521 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:47.780 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.347 "name": "BaseBdev2", 00:16:48.347 "aliases": [ 00:16:48.347 "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994" 00:16:48.347 ], 00:16:48.347 "product_name": "Malloc disk", 00:16:48.347 "block_size": 512, 00:16:48.347 "num_blocks": 65536, 00:16:48.347 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:48.347 "assigned_rate_limits": { 00:16:48.347 "rw_ios_per_sec": 0, 00:16:48.347 "rw_mbytes_per_sec": 0, 00:16:48.347 "r_mbytes_per_sec": 0, 00:16:48.347 "w_mbytes_per_sec": 0 00:16:48.347 }, 00:16:48.347 "claimed": true, 00:16:48.347 "claim_type": "exclusive_write", 00:16:48.347 "zoned": false, 00:16:48.347 "supported_io_types": { 00:16:48.347 "read": true, 00:16:48.347 "write": true, 00:16:48.347 "unmap": true, 00:16:48.347 "flush": true, 00:16:48.347 "reset": true, 00:16:48.347 "nvme_admin": false, 00:16:48.347 "nvme_io": false, 00:16:48.347 "nvme_io_md": false, 00:16:48.347 "write_zeroes": true, 00:16:48.347 "zcopy": true, 00:16:48.347 "get_zone_info": false, 00:16:48.347 "zone_management": false, 00:16:48.347 "zone_append": false, 00:16:48.347 "compare": false, 00:16:48.347 "compare_and_write": false, 00:16:48.347 "abort": true, 00:16:48.347 "seek_hole": false, 00:16:48.347 "seek_data": false, 00:16:48.347 "copy": true, 00:16:48.347 "nvme_iov_md": false 00:16:48.347 }, 00:16:48.347 "memory_domains": [ 00:16:48.347 { 00:16:48.347 "dma_device_id": "system", 00:16:48.347 "dma_device_type": 1 00:16:48.347 }, 00:16:48.347 { 00:16:48.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.347 "dma_device_type": 2 00:16:48.347 } 00:16:48.347 ], 00:16:48.347 "driver_specific": {} 00:16:48.347 }' 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:48.347 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.607 11:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:48.607 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:48.607 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:48.607 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:48.607 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:48.866 "name": "BaseBdev3", 00:16:48.866 "aliases": [ 00:16:48.866 "c4f70158-8e3c-4215-ad25-973c79aeaf30" 00:16:48.866 ], 00:16:48.866 "product_name": "Malloc disk", 00:16:48.866 "block_size": 512, 00:16:48.866 "num_blocks": 65536, 00:16:48.866 "uuid": "c4f70158-8e3c-4215-ad25-973c79aeaf30", 00:16:48.866 "assigned_rate_limits": { 00:16:48.866 "rw_ios_per_sec": 0, 00:16:48.866 "rw_mbytes_per_sec": 0, 00:16:48.866 "r_mbytes_per_sec": 0, 00:16:48.866 
"w_mbytes_per_sec": 0 00:16:48.866 }, 00:16:48.866 "claimed": true, 00:16:48.866 "claim_type": "exclusive_write", 00:16:48.866 "zoned": false, 00:16:48.866 "supported_io_types": { 00:16:48.866 "read": true, 00:16:48.866 "write": true, 00:16:48.866 "unmap": true, 00:16:48.866 "flush": true, 00:16:48.866 "reset": true, 00:16:48.866 "nvme_admin": false, 00:16:48.866 "nvme_io": false, 00:16:48.866 "nvme_io_md": false, 00:16:48.866 "write_zeroes": true, 00:16:48.866 "zcopy": true, 00:16:48.866 "get_zone_info": false, 00:16:48.866 "zone_management": false, 00:16:48.866 "zone_append": false, 00:16:48.866 "compare": false, 00:16:48.866 "compare_and_write": false, 00:16:48.866 "abort": true, 00:16:48.866 "seek_hole": false, 00:16:48.866 "seek_data": false, 00:16:48.866 "copy": true, 00:16:48.866 "nvme_iov_md": false 00:16:48.866 }, 00:16:48.866 "memory_domains": [ 00:16:48.866 { 00:16:48.866 "dma_device_id": "system", 00:16:48.866 "dma_device_type": 1 00:16:48.866 }, 00:16:48.866 { 00:16:48.866 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:48.866 "dma_device_type": 2 00:16:48.866 } 00:16:48.866 ], 00:16:48.866 "driver_specific": {} 00:16:48.866 }' 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:48.866 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:49.126 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:49.385 [2024-07-15 11:58:02.920410] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:49.385 [2024-07-15 11:58:02.920442] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:49.385 [2024-07-15 11:58:02.920485] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.385 11:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.644 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.644 "name": "Existed_Raid", 00:16:49.644 "uuid": "d409b648-d32f-43e6-9c86-7aa9467cb46d", 00:16:49.644 "strip_size_kb": 64, 00:16:49.644 "state": "offline", 00:16:49.644 "raid_level": "concat", 00:16:49.644 "superblock": true, 00:16:49.644 "num_base_bdevs": 3, 00:16:49.644 "num_base_bdevs_discovered": 2, 00:16:49.644 "num_base_bdevs_operational": 2, 00:16:49.644 "base_bdevs_list": [ 00:16:49.644 { 00:16:49.644 "name": null, 00:16:49.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:49.644 "is_configured": false, 00:16:49.644 "data_offset": 2048, 00:16:49.644 "data_size": 63488 00:16:49.644 }, 00:16:49.644 { 00:16:49.644 "name": "BaseBdev2", 00:16:49.644 "uuid": "965a7f79-0bc8-4bb1-b8a8-f8b45fe82994", 00:16:49.644 "is_configured": true, 00:16:49.644 "data_offset": 2048, 00:16:49.644 "data_size": 
63488 00:16:49.644 }, 00:16:49.644 { 00:16:49.644 "name": "BaseBdev3", 00:16:49.644 "uuid": "c4f70158-8e3c-4215-ad25-973c79aeaf30", 00:16:49.644 "is_configured": true, 00:16:49.644 "data_offset": 2048, 00:16:49.644 "data_size": 63488 00:16:49.644 } 00:16:49.644 ] 00:16:49.644 }' 00:16:49.644 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.644 11:58:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:50.580 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:50.580 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:50.580 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.580 11:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:50.580 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:50.580 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:50.580 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:50.840 [2024-07-15 11:58:04.257318] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:50.840 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:50.840 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:50.840 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:16:50.840 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:51.098 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:51.099 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:51.099 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:51.358 [2024-07-15 11:58:04.697138] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:51.358 [2024-07-15 11:58:04.697184] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8b480 name Existed_Raid, state offline 00:16:51.358 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:51.358 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:51.358 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:51.358 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:51.618 11:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:51.877 BaseBdev2 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:51.877 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.137 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:52.137 [ 00:16:52.137 { 00:16:52.137 "name": "BaseBdev2", 00:16:52.137 "aliases": [ 00:16:52.137 "4b388496-a064-4bd8-8709-5c8e4f28ac8d" 00:16:52.137 ], 00:16:52.137 "product_name": "Malloc disk", 00:16:52.137 "block_size": 512, 00:16:52.137 "num_blocks": 65536, 00:16:52.137 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:52.137 "assigned_rate_limits": { 00:16:52.137 "rw_ios_per_sec": 0, 00:16:52.137 "rw_mbytes_per_sec": 0, 00:16:52.137 "r_mbytes_per_sec": 0, 00:16:52.137 "w_mbytes_per_sec": 0 00:16:52.137 }, 00:16:52.137 "claimed": false, 00:16:52.137 "zoned": false, 00:16:52.137 "supported_io_types": { 00:16:52.137 "read": true, 00:16:52.137 "write": true, 00:16:52.137 "unmap": true, 00:16:52.137 "flush": 
true, 00:16:52.137 "reset": true, 00:16:52.137 "nvme_admin": false, 00:16:52.137 "nvme_io": false, 00:16:52.137 "nvme_io_md": false, 00:16:52.137 "write_zeroes": true, 00:16:52.137 "zcopy": true, 00:16:52.137 "get_zone_info": false, 00:16:52.137 "zone_management": false, 00:16:52.137 "zone_append": false, 00:16:52.137 "compare": false, 00:16:52.137 "compare_and_write": false, 00:16:52.137 "abort": true, 00:16:52.137 "seek_hole": false, 00:16:52.137 "seek_data": false, 00:16:52.137 "copy": true, 00:16:52.137 "nvme_iov_md": false 00:16:52.137 }, 00:16:52.137 "memory_domains": [ 00:16:52.137 { 00:16:52.137 "dma_device_id": "system", 00:16:52.137 "dma_device_type": 1 00:16:52.137 }, 00:16:52.137 { 00:16:52.137 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.137 "dma_device_type": 2 00:16:52.137 } 00:16:52.137 ], 00:16:52.137 "driver_specific": {} 00:16:52.137 } 00:16:52.137 ] 00:16:52.137 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:52.137 11:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:52.137 11:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:52.137 11:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:52.396 BaseBdev3 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:52.396 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:52.397 11:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:52.656 11:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:52.915 [ 00:16:52.915 { 00:16:52.915 "name": "BaseBdev3", 00:16:52.915 "aliases": [ 00:16:52.915 "3e16eb1b-927a-4c01-a309-4194281da5ad" 00:16:52.915 ], 00:16:52.915 "product_name": "Malloc disk", 00:16:52.915 "block_size": 512, 00:16:52.915 "num_blocks": 65536, 00:16:52.915 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:52.915 "assigned_rate_limits": { 00:16:52.915 "rw_ios_per_sec": 0, 00:16:52.915 "rw_mbytes_per_sec": 0, 00:16:52.915 "r_mbytes_per_sec": 0, 00:16:52.915 "w_mbytes_per_sec": 0 00:16:52.915 }, 00:16:52.915 "claimed": false, 00:16:52.915 "zoned": false, 00:16:52.915 "supported_io_types": { 00:16:52.915 "read": true, 00:16:52.915 "write": true, 00:16:52.915 "unmap": true, 00:16:52.915 "flush": true, 00:16:52.915 "reset": true, 00:16:52.915 "nvme_admin": false, 00:16:52.915 "nvme_io": false, 00:16:52.915 "nvme_io_md": false, 00:16:52.915 "write_zeroes": true, 00:16:52.915 "zcopy": true, 00:16:52.915 "get_zone_info": false, 00:16:52.915 "zone_management": false, 00:16:52.915 "zone_append": false, 00:16:52.915 "compare": false, 00:16:52.915 "compare_and_write": false, 00:16:52.915 "abort": true, 00:16:52.915 "seek_hole": false, 00:16:52.915 "seek_data": false, 00:16:52.915 "copy": true, 00:16:52.915 "nvme_iov_md": false 00:16:52.915 }, 00:16:52.915 "memory_domains": [ 00:16:52.915 { 00:16:52.915 "dma_device_id": "system", 00:16:52.915 "dma_device_type": 1 
00:16:52.915 }, 00:16:52.915 { 00:16:52.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.915 "dma_device_type": 2 00:16:52.915 } 00:16:52.915 ], 00:16:52.915 "driver_specific": {} 00:16:52.915 } 00:16:52.915 ] 00:16:52.915 11:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:52.915 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:52.915 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:52.915 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:53.174 [2024-07-15 11:58:06.651098] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:53.174 [2024-07-15 11:58:06.651147] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:53.174 [2024-07-15 11:58:06.651169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:53.174 [2024-07-15 11:58:06.652716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:53.175 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:53.434 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:53.434 "name": "Existed_Raid", 00:16:53.434 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:16:53.434 "strip_size_kb": 64, 00:16:53.434 "state": "configuring", 00:16:53.434 "raid_level": "concat", 00:16:53.434 "superblock": true, 00:16:53.434 "num_base_bdevs": 3, 00:16:53.434 "num_base_bdevs_discovered": 2, 00:16:53.434 "num_base_bdevs_operational": 3, 00:16:53.434 "base_bdevs_list": [ 00:16:53.434 { 00:16:53.434 "name": "BaseBdev1", 00:16:53.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:53.434 "is_configured": false, 00:16:53.434 "data_offset": 0, 00:16:53.434 "data_size": 0 00:16:53.434 }, 00:16:53.434 { 00:16:53.434 "name": "BaseBdev2", 00:16:53.434 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:53.434 "is_configured": true, 00:16:53.434 "data_offset": 2048, 00:16:53.434 "data_size": 63488 00:16:53.434 }, 00:16:53.434 { 00:16:53.434 "name": "BaseBdev3", 00:16:53.434 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:53.434 "is_configured": true, 00:16:53.434 "data_offset": 2048, 00:16:53.434 
"data_size": 63488 00:16:53.434 } 00:16:53.434 ] 00:16:53.434 }' 00:16:53.434 11:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:53.434 11:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:54.372 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:54.372 [2024-07-15 11:58:07.950519] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:54.372 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:54.372 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:54.631 11:58:07 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.631 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.631 "name": "Existed_Raid", 00:16:54.631 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:16:54.631 "strip_size_kb": 64, 00:16:54.631 "state": "configuring", 00:16:54.631 "raid_level": "concat", 00:16:54.631 "superblock": true, 00:16:54.631 "num_base_bdevs": 3, 00:16:54.631 "num_base_bdevs_discovered": 1, 00:16:54.631 "num_base_bdevs_operational": 3, 00:16:54.631 "base_bdevs_list": [ 00:16:54.631 { 00:16:54.631 "name": "BaseBdev1", 00:16:54.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:54.631 "is_configured": false, 00:16:54.631 "data_offset": 0, 00:16:54.631 "data_size": 0 00:16:54.631 }, 00:16:54.631 { 00:16:54.631 "name": null, 00:16:54.631 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:54.631 "is_configured": false, 00:16:54.631 "data_offset": 2048, 00:16:54.631 "data_size": 63488 00:16:54.631 }, 00:16:54.631 { 00:16:54.631 "name": "BaseBdev3", 00:16:54.631 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:54.631 "is_configured": true, 00:16:54.631 "data_offset": 2048, 00:16:54.631 "data_size": 63488 00:16:54.631 } 00:16:54.631 ] 00:16:54.631 }' 00:16:54.631 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.631 11:58:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:55.199 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.200 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:55.458 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:16:55.458 11:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:55.717 [2024-07-15 11:58:09.241325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:55.717 BaseBdev1 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:16:55.717 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:55.976 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:56.235 [ 00:16:56.236 { 00:16:56.236 "name": "BaseBdev1", 00:16:56.236 "aliases": [ 00:16:56.236 "e019fe24-ce0a-4659-871e-b856dbe89a3d" 00:16:56.236 ], 00:16:56.236 "product_name": "Malloc disk", 00:16:56.236 "block_size": 512, 00:16:56.236 "num_blocks": 65536, 00:16:56.236 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:16:56.236 "assigned_rate_limits": { 00:16:56.236 "rw_ios_per_sec": 0, 00:16:56.236 "rw_mbytes_per_sec": 0, 00:16:56.236 "r_mbytes_per_sec": 0, 00:16:56.236 
"w_mbytes_per_sec": 0 00:16:56.236 }, 00:16:56.236 "claimed": true, 00:16:56.236 "claim_type": "exclusive_write", 00:16:56.236 "zoned": false, 00:16:56.236 "supported_io_types": { 00:16:56.236 "read": true, 00:16:56.236 "write": true, 00:16:56.236 "unmap": true, 00:16:56.236 "flush": true, 00:16:56.236 "reset": true, 00:16:56.236 "nvme_admin": false, 00:16:56.236 "nvme_io": false, 00:16:56.236 "nvme_io_md": false, 00:16:56.236 "write_zeroes": true, 00:16:56.236 "zcopy": true, 00:16:56.236 "get_zone_info": false, 00:16:56.236 "zone_management": false, 00:16:56.236 "zone_append": false, 00:16:56.236 "compare": false, 00:16:56.236 "compare_and_write": false, 00:16:56.236 "abort": true, 00:16:56.236 "seek_hole": false, 00:16:56.236 "seek_data": false, 00:16:56.236 "copy": true, 00:16:56.236 "nvme_iov_md": false 00:16:56.236 }, 00:16:56.236 "memory_domains": [ 00:16:56.236 { 00:16:56.236 "dma_device_id": "system", 00:16:56.236 "dma_device_type": 1 00:16:56.236 }, 00:16:56.236 { 00:16:56.236 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.236 "dma_device_type": 2 00:16:56.236 } 00:16:56.236 ], 00:16:56.236 "driver_specific": {} 00:16:56.236 } 00:16:56.236 ] 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:56.236 11:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:56.495 11:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:56.495 "name": "Existed_Raid", 00:16:56.495 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:16:56.495 "strip_size_kb": 64, 00:16:56.495 "state": "configuring", 00:16:56.495 "raid_level": "concat", 00:16:56.495 "superblock": true, 00:16:56.495 "num_base_bdevs": 3, 00:16:56.495 "num_base_bdevs_discovered": 2, 00:16:56.495 "num_base_bdevs_operational": 3, 00:16:56.495 "base_bdevs_list": [ 00:16:56.496 { 00:16:56.496 "name": "BaseBdev1", 00:16:56.496 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:16:56.496 "is_configured": true, 00:16:56.496 "data_offset": 2048, 00:16:56.496 "data_size": 63488 00:16:56.496 }, 00:16:56.496 { 00:16:56.496 "name": null, 00:16:56.496 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:56.496 "is_configured": false, 00:16:56.496 "data_offset": 2048, 00:16:56.496 "data_size": 63488 00:16:56.496 }, 00:16:56.496 { 00:16:56.496 "name": "BaseBdev3", 00:16:56.496 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:56.496 "is_configured": true, 00:16:56.496 "data_offset": 2048, 00:16:56.496 "data_size": 63488 00:16:56.496 } 
00:16:56.496 ] 00:16:56.496 }' 00:16:56.496 11:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:56.496 11:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:57.063 11:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.063 11:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:57.633 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:57.633 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:57.893 [2024-07-15 11:58:11.403107] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.893 
11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.893 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.153 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.153 "name": "Existed_Raid", 00:16:58.153 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:16:58.153 "strip_size_kb": 64, 00:16:58.153 "state": "configuring", 00:16:58.153 "raid_level": "concat", 00:16:58.153 "superblock": true, 00:16:58.153 "num_base_bdevs": 3, 00:16:58.153 "num_base_bdevs_discovered": 1, 00:16:58.153 "num_base_bdevs_operational": 3, 00:16:58.153 "base_bdevs_list": [ 00:16:58.153 { 00:16:58.153 "name": "BaseBdev1", 00:16:58.153 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:16:58.153 "is_configured": true, 00:16:58.153 "data_offset": 2048, 00:16:58.153 "data_size": 63488 00:16:58.153 }, 00:16:58.153 { 00:16:58.153 "name": null, 00:16:58.153 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:58.153 "is_configured": false, 00:16:58.153 "data_offset": 2048, 00:16:58.153 "data_size": 63488 00:16:58.153 }, 00:16:58.153 { 00:16:58.153 "name": null, 00:16:58.153 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:58.153 "is_configured": false, 00:16:58.153 "data_offset": 2048, 00:16:58.153 "data_size": 63488 00:16:58.153 } 00:16:58.153 ] 00:16:58.153 }' 00:16:58.153 11:58:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.153 11:58:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:58.722 11:58:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.722 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:58.981 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:58.981 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:59.241 [2024-07-15 11:58:12.766745] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.241 11:58:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.241 11:58:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.809 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.809 "name": "Existed_Raid", 00:16:59.809 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:16:59.809 "strip_size_kb": 64, 00:16:59.809 "state": "configuring", 00:16:59.809 "raid_level": "concat", 00:16:59.809 "superblock": true, 00:16:59.809 "num_base_bdevs": 3, 00:16:59.809 "num_base_bdevs_discovered": 2, 00:16:59.809 "num_base_bdevs_operational": 3, 00:16:59.809 "base_bdevs_list": [ 00:16:59.809 { 00:16:59.809 "name": "BaseBdev1", 00:16:59.809 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:16:59.809 "is_configured": true, 00:16:59.809 "data_offset": 2048, 00:16:59.809 "data_size": 63488 00:16:59.809 }, 00:16:59.809 { 00:16:59.809 "name": null, 00:16:59.809 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:16:59.809 "is_configured": false, 00:16:59.809 "data_offset": 2048, 00:16:59.809 "data_size": 63488 00:16:59.809 }, 00:16:59.809 { 00:16:59.809 "name": "BaseBdev3", 00:16:59.809 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:16:59.809 "is_configured": true, 00:16:59.809 "data_offset": 2048, 00:16:59.809 "data_size": 63488 00:16:59.809 } 00:16:59.809 ] 00:16:59.809 }' 00:16:59.809 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.809 11:58:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.376 11:58:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.376 11:58:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:00.634 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:00.634 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:00.893 [2024-07-15 11:58:14.383112] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:00.893 11:58:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.460 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.460 "name": "Existed_Raid", 00:17:01.460 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:17:01.460 "strip_size_kb": 64, 00:17:01.460 "state": "configuring", 00:17:01.460 "raid_level": "concat", 00:17:01.460 "superblock": true, 00:17:01.460 "num_base_bdevs": 3, 00:17:01.460 "num_base_bdevs_discovered": 1, 00:17:01.460 "num_base_bdevs_operational": 3, 00:17:01.460 "base_bdevs_list": [ 00:17:01.460 { 00:17:01.460 "name": null, 00:17:01.460 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:01.460 "is_configured": false, 00:17:01.460 "data_offset": 2048, 00:17:01.460 "data_size": 63488 00:17:01.460 }, 00:17:01.460 { 00:17:01.460 "name": null, 00:17:01.460 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:17:01.460 "is_configured": false, 00:17:01.460 "data_offset": 2048, 00:17:01.460 "data_size": 63488 00:17:01.460 }, 00:17:01.460 { 00:17:01.460 "name": "BaseBdev3", 00:17:01.460 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:17:01.460 "is_configured": true, 00:17:01.460 "data_offset": 2048, 00:17:01.460 "data_size": 63488 00:17:01.460 } 00:17:01.460 ] 00:17:01.460 }' 00:17:01.460 11:58:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.460 11:58:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:02.049 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.049 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:02.308 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:02.308 11:58:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:02.568 [2024-07-15 11:58:15.939543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.568 11:58:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.828 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.828 "name": 
"Existed_Raid", 00:17:02.828 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:17:02.828 "strip_size_kb": 64, 00:17:02.828 "state": "configuring", 00:17:02.828 "raid_level": "concat", 00:17:02.828 "superblock": true, 00:17:02.828 "num_base_bdevs": 3, 00:17:02.828 "num_base_bdevs_discovered": 2, 00:17:02.828 "num_base_bdevs_operational": 3, 00:17:02.828 "base_bdevs_list": [ 00:17:02.828 { 00:17:02.828 "name": null, 00:17:02.828 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:02.828 "is_configured": false, 00:17:02.828 "data_offset": 2048, 00:17:02.828 "data_size": 63488 00:17:02.828 }, 00:17:02.828 { 00:17:02.828 "name": "BaseBdev2", 00:17:02.828 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:17:02.828 "is_configured": true, 00:17:02.828 "data_offset": 2048, 00:17:02.828 "data_size": 63488 00:17:02.828 }, 00:17:02.828 { 00:17:02.828 "name": "BaseBdev3", 00:17:02.828 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:17:02.828 "is_configured": true, 00:17:02.828 "data_offset": 2048, 00:17:02.828 "data_size": 63488 00:17:02.828 } 00:17:02.828 ] 00:17:02.828 }' 00:17:02.828 11:58:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.828 11:58:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.765 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:03.765 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:04.027 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:04.027 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.027 11:58:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:04.027 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u e019fe24-ce0a-4659-871e-b856dbe89a3d 00:17:04.327 [2024-07-15 11:58:17.837254] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:04.327 [2024-07-15 11:58:17.837417] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e8bc10 00:17:04.327 [2024-07-15 11:58:17.837430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:04.327 [2024-07-15 11:58:17.837608] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e8d8f0 00:17:04.327 [2024-07-15 11:58:17.837742] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e8bc10 00:17:04.327 [2024-07-15 11:58:17.837753] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1e8bc10 00:17:04.327 [2024-07-15 11:58:17.837852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:04.327 NewBaseBdev 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:04.327 11:58:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:04.327 11:58:17 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:04.586 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:04.845 [ 00:17:04.845 { 00:17:04.845 "name": "NewBaseBdev", 00:17:04.845 "aliases": [ 00:17:04.845 "e019fe24-ce0a-4659-871e-b856dbe89a3d" 00:17:04.845 ], 00:17:04.845 "product_name": "Malloc disk", 00:17:04.845 "block_size": 512, 00:17:04.845 "num_blocks": 65536, 00:17:04.845 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:04.845 "assigned_rate_limits": { 00:17:04.845 "rw_ios_per_sec": 0, 00:17:04.845 "rw_mbytes_per_sec": 0, 00:17:04.845 "r_mbytes_per_sec": 0, 00:17:04.845 "w_mbytes_per_sec": 0 00:17:04.845 }, 00:17:04.845 "claimed": true, 00:17:04.845 "claim_type": "exclusive_write", 00:17:04.845 "zoned": false, 00:17:04.845 "supported_io_types": { 00:17:04.845 "read": true, 00:17:04.845 "write": true, 00:17:04.845 "unmap": true, 00:17:04.845 "flush": true, 00:17:04.845 "reset": true, 00:17:04.845 "nvme_admin": false, 00:17:04.845 "nvme_io": false, 00:17:04.845 "nvme_io_md": false, 00:17:04.845 "write_zeroes": true, 00:17:04.845 "zcopy": true, 00:17:04.845 "get_zone_info": false, 00:17:04.845 "zone_management": false, 00:17:04.845 "zone_append": false, 00:17:04.845 "compare": false, 00:17:04.845 "compare_and_write": false, 00:17:04.845 "abort": true, 00:17:04.845 "seek_hole": false, 00:17:04.845 "seek_data": false, 00:17:04.845 "copy": true, 00:17:04.845 "nvme_iov_md": false 00:17:04.845 }, 00:17:04.845 "memory_domains": [ 00:17:04.845 { 00:17:04.845 "dma_device_id": "system", 00:17:04.845 "dma_device_type": 1 00:17:04.845 }, 00:17:04.845 { 00:17:04.845 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.845 "dma_device_type": 2 00:17:04.845 } 
00:17:04.845 ], 00:17:04.845 "driver_specific": {} 00:17:04.845 } 00:17:04.845 ] 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.845 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:05.103 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:05.103 "name": "Existed_Raid", 00:17:05.103 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:17:05.103 "strip_size_kb": 64, 00:17:05.103 "state": "online", 00:17:05.103 
"raid_level": "concat", 00:17:05.103 "superblock": true, 00:17:05.103 "num_base_bdevs": 3, 00:17:05.103 "num_base_bdevs_discovered": 3, 00:17:05.103 "num_base_bdevs_operational": 3, 00:17:05.103 "base_bdevs_list": [ 00:17:05.103 { 00:17:05.103 "name": "NewBaseBdev", 00:17:05.103 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:05.103 "is_configured": true, 00:17:05.103 "data_offset": 2048, 00:17:05.103 "data_size": 63488 00:17:05.103 }, 00:17:05.103 { 00:17:05.103 "name": "BaseBdev2", 00:17:05.103 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:17:05.103 "is_configured": true, 00:17:05.103 "data_offset": 2048, 00:17:05.103 "data_size": 63488 00:17:05.103 }, 00:17:05.103 { 00:17:05.103 "name": "BaseBdev3", 00:17:05.103 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:17:05.103 "is_configured": true, 00:17:05.103 "data_offset": 2048, 00:17:05.103 "data_size": 63488 00:17:05.103 } 00:17:05.103 ] 00:17:05.103 }' 00:17:05.103 11:58:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:05.103 11:58:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:05.670 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:05.929 [2024-07-15 11:58:19.413744] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.929 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:05.929 "name": "Existed_Raid", 00:17:05.929 "aliases": [ 00:17:05.929 "8a63add6-5427-46bb-8597-0c4da0bc0897" 00:17:05.929 ], 00:17:05.929 "product_name": "Raid Volume", 00:17:05.929 "block_size": 512, 00:17:05.929 "num_blocks": 190464, 00:17:05.929 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:17:05.929 "assigned_rate_limits": { 00:17:05.929 "rw_ios_per_sec": 0, 00:17:05.929 "rw_mbytes_per_sec": 0, 00:17:05.929 "r_mbytes_per_sec": 0, 00:17:05.929 "w_mbytes_per_sec": 0 00:17:05.929 }, 00:17:05.929 "claimed": false, 00:17:05.929 "zoned": false, 00:17:05.929 "supported_io_types": { 00:17:05.929 "read": true, 00:17:05.929 "write": true, 00:17:05.929 "unmap": true, 00:17:05.929 "flush": true, 00:17:05.929 "reset": true, 00:17:05.929 "nvme_admin": false, 00:17:05.930 "nvme_io": false, 00:17:05.930 "nvme_io_md": false, 00:17:05.930 "write_zeroes": true, 00:17:05.930 "zcopy": false, 00:17:05.930 "get_zone_info": false, 00:17:05.930 "zone_management": false, 00:17:05.930 "zone_append": false, 00:17:05.930 "compare": false, 00:17:05.930 "compare_and_write": false, 00:17:05.930 "abort": false, 00:17:05.930 "seek_hole": false, 00:17:05.930 "seek_data": false, 00:17:05.930 "copy": false, 00:17:05.930 "nvme_iov_md": false 00:17:05.930 }, 00:17:05.930 "memory_domains": [ 00:17:05.930 { 00:17:05.930 "dma_device_id": "system", 00:17:05.930 "dma_device_type": 1 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.930 "dma_device_type": 2 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "dma_device_id": "system", 00:17:05.930 "dma_device_type": 1 00:17:05.930 
}, 00:17:05.930 { 00:17:05.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.930 "dma_device_type": 2 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "dma_device_id": "system", 00:17:05.930 "dma_device_type": 1 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.930 "dma_device_type": 2 00:17:05.930 } 00:17:05.930 ], 00:17:05.930 "driver_specific": { 00:17:05.930 "raid": { 00:17:05.930 "uuid": "8a63add6-5427-46bb-8597-0c4da0bc0897", 00:17:05.930 "strip_size_kb": 64, 00:17:05.930 "state": "online", 00:17:05.930 "raid_level": "concat", 00:17:05.930 "superblock": true, 00:17:05.930 "num_base_bdevs": 3, 00:17:05.930 "num_base_bdevs_discovered": 3, 00:17:05.930 "num_base_bdevs_operational": 3, 00:17:05.930 "base_bdevs_list": [ 00:17:05.930 { 00:17:05.930 "name": "NewBaseBdev", 00:17:05.930 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:05.930 "is_configured": true, 00:17:05.930 "data_offset": 2048, 00:17:05.930 "data_size": 63488 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "name": "BaseBdev2", 00:17:05.930 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:17:05.930 "is_configured": true, 00:17:05.930 "data_offset": 2048, 00:17:05.930 "data_size": 63488 00:17:05.930 }, 00:17:05.930 { 00:17:05.930 "name": "BaseBdev3", 00:17:05.930 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:17:05.930 "is_configured": true, 00:17:05.930 "data_offset": 2048, 00:17:05.930 "data_size": 63488 00:17:05.930 } 00:17:05.930 ] 00:17:05.930 } 00:17:05.930 } 00:17:05.930 }' 00:17:05.930 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:05.930 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:05.930 BaseBdev2 00:17:05.930 BaseBdev3' 00:17:05.930 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.930 
11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:05.930 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.189 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.189 "name": "NewBaseBdev", 00:17:06.189 "aliases": [ 00:17:06.189 "e019fe24-ce0a-4659-871e-b856dbe89a3d" 00:17:06.189 ], 00:17:06.189 "product_name": "Malloc disk", 00:17:06.189 "block_size": 512, 00:17:06.189 "num_blocks": 65536, 00:17:06.189 "uuid": "e019fe24-ce0a-4659-871e-b856dbe89a3d", 00:17:06.189 "assigned_rate_limits": { 00:17:06.189 "rw_ios_per_sec": 0, 00:17:06.189 "rw_mbytes_per_sec": 0, 00:17:06.189 "r_mbytes_per_sec": 0, 00:17:06.189 "w_mbytes_per_sec": 0 00:17:06.189 }, 00:17:06.189 "claimed": true, 00:17:06.189 "claim_type": "exclusive_write", 00:17:06.189 "zoned": false, 00:17:06.189 "supported_io_types": { 00:17:06.189 "read": true, 00:17:06.189 "write": true, 00:17:06.189 "unmap": true, 00:17:06.189 "flush": true, 00:17:06.189 "reset": true, 00:17:06.189 "nvme_admin": false, 00:17:06.189 "nvme_io": false, 00:17:06.189 "nvme_io_md": false, 00:17:06.189 "write_zeroes": true, 00:17:06.189 "zcopy": true, 00:17:06.189 "get_zone_info": false, 00:17:06.189 "zone_management": false, 00:17:06.189 "zone_append": false, 00:17:06.189 "compare": false, 00:17:06.189 "compare_and_write": false, 00:17:06.189 "abort": true, 00:17:06.189 "seek_hole": false, 00:17:06.189 "seek_data": false, 00:17:06.189 "copy": true, 00:17:06.189 "nvme_iov_md": false 00:17:06.189 }, 00:17:06.189 "memory_domains": [ 00:17:06.189 { 00:17:06.189 "dma_device_id": "system", 00:17:06.189 "dma_device_type": 1 00:17:06.189 }, 00:17:06.189 { 00:17:06.189 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.189 "dma_device_type": 2 00:17:06.189 } 00:17:06.189 ], 00:17:06.189 
"driver_specific": {} 00:17:06.189 }' 00:17:06.189 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.189 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.448 11:58:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.708 "name": "BaseBdev2", 00:17:06.708 "aliases": [ 00:17:06.708 "4b388496-a064-4bd8-8709-5c8e4f28ac8d" 00:17:06.708 ], 00:17:06.708 "product_name": 
"Malloc disk", 00:17:06.708 "block_size": 512, 00:17:06.708 "num_blocks": 65536, 00:17:06.708 "uuid": "4b388496-a064-4bd8-8709-5c8e4f28ac8d", 00:17:06.708 "assigned_rate_limits": { 00:17:06.708 "rw_ios_per_sec": 0, 00:17:06.708 "rw_mbytes_per_sec": 0, 00:17:06.708 "r_mbytes_per_sec": 0, 00:17:06.708 "w_mbytes_per_sec": 0 00:17:06.708 }, 00:17:06.708 "claimed": true, 00:17:06.708 "claim_type": "exclusive_write", 00:17:06.708 "zoned": false, 00:17:06.708 "supported_io_types": { 00:17:06.708 "read": true, 00:17:06.708 "write": true, 00:17:06.708 "unmap": true, 00:17:06.708 "flush": true, 00:17:06.708 "reset": true, 00:17:06.708 "nvme_admin": false, 00:17:06.708 "nvme_io": false, 00:17:06.708 "nvme_io_md": false, 00:17:06.708 "write_zeroes": true, 00:17:06.708 "zcopy": true, 00:17:06.708 "get_zone_info": false, 00:17:06.708 "zone_management": false, 00:17:06.708 "zone_append": false, 00:17:06.708 "compare": false, 00:17:06.708 "compare_and_write": false, 00:17:06.708 "abort": true, 00:17:06.708 "seek_hole": false, 00:17:06.708 "seek_data": false, 00:17:06.708 "copy": true, 00:17:06.708 "nvme_iov_md": false 00:17:06.708 }, 00:17:06.708 "memory_domains": [ 00:17:06.708 { 00:17:06.708 "dma_device_id": "system", 00:17:06.708 "dma_device_type": 1 00:17:06.708 }, 00:17:06.708 { 00:17:06.708 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.708 "dma_device_type": 2 00:17:06.708 } 00:17:06.708 ], 00:17:06.708 "driver_specific": {} 00:17:06.708 }' 00:17:06.708 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.967 
11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.967 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.226 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.226 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.226 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.226 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:07.226 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.485 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.485 "name": "BaseBdev3", 00:17:07.485 "aliases": [ 00:17:07.485 "3e16eb1b-927a-4c01-a309-4194281da5ad" 00:17:07.485 ], 00:17:07.485 "product_name": "Malloc disk", 00:17:07.485 "block_size": 512, 00:17:07.485 "num_blocks": 65536, 00:17:07.485 "uuid": "3e16eb1b-927a-4c01-a309-4194281da5ad", 00:17:07.485 "assigned_rate_limits": { 00:17:07.485 "rw_ios_per_sec": 0, 00:17:07.485 "rw_mbytes_per_sec": 0, 00:17:07.485 "r_mbytes_per_sec": 0, 00:17:07.485 "w_mbytes_per_sec": 0 00:17:07.485 }, 00:17:07.485 "claimed": true, 00:17:07.485 "claim_type": "exclusive_write", 00:17:07.485 "zoned": false, 00:17:07.485 "supported_io_types": { 00:17:07.485 "read": true, 00:17:07.485 "write": true, 00:17:07.485 "unmap": true, 
00:17:07.485 "flush": true, 00:17:07.485 "reset": true, 00:17:07.485 "nvme_admin": false, 00:17:07.485 "nvme_io": false, 00:17:07.485 "nvme_io_md": false, 00:17:07.485 "write_zeroes": true, 00:17:07.485 "zcopy": true, 00:17:07.485 "get_zone_info": false, 00:17:07.485 "zone_management": false, 00:17:07.485 "zone_append": false, 00:17:07.485 "compare": false, 00:17:07.485 "compare_and_write": false, 00:17:07.485 "abort": true, 00:17:07.485 "seek_hole": false, 00:17:07.485 "seek_data": false, 00:17:07.485 "copy": true, 00:17:07.485 "nvme_iov_md": false 00:17:07.485 }, 00:17:07.485 "memory_domains": [ 00:17:07.485 { 00:17:07.486 "dma_device_id": "system", 00:17:07.486 "dma_device_type": 1 00:17:07.486 }, 00:17:07.486 { 00:17:07.486 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.486 "dma_device_type": 2 00:17:07.486 } 00:17:07.486 ], 00:17:07.486 "driver_specific": {} 00:17:07.486 }' 00:17:07.486 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.486 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.486 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.486 11:58:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.486 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.486 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.486 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.744 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.744 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.744 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.744 11:58:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.744 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.744 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:08.311 [2024-07-15 11:58:21.711575] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:08.311 [2024-07-15 11:58:21.711607] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:08.311 [2024-07-15 11:58:21.711671] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:08.311 [2024-07-15 11:58:21.711734] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:08.311 [2024-07-15 11:58:21.711748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8bc10 name Existed_Raid, state offline 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1494310 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1494310 ']' 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1494310 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1494310 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' 
reactor_0 = sudo ']' 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1494310' 00:17:08.311 killing process with pid 1494310 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1494310 00:17:08.311 [2024-07-15 11:58:21.794874] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:08.311 11:58:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1494310 00:17:08.311 [2024-07-15 11:58:21.822096] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:08.569 11:58:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:08.569 00:17:08.569 real 0m30.213s 00:17:08.569 user 0m55.623s 00:17:08.569 sys 0m5.310s 00:17:08.569 11:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:08.569 11:58:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:08.569 ************************************ 00:17:08.569 END TEST raid_state_function_test_sb 00:17:08.569 ************************************ 00:17:08.569 11:58:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:08.569 11:58:22 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:17:08.569 11:58:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:17:08.569 11:58:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:08.569 11:58:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:08.569 ************************************ 00:17:08.569 START TEST raid_superblock_test 00:17:08.569 ************************************ 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 3 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # 
local raid_level=concat 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:08.569 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1498788 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1498788 /var/tmp/spdk-raid.sock 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -L bdev_raid 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1498788 ']' 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:08.570 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:08.570 11:58:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:08.827 [2024-07-15 11:58:22.178433] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:17:08.827 [2024-07-15 11:58:22.178504] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1498788 ] 00:17:08.827 [2024-07-15 11:58:22.304280] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:08.827 [2024-07-15 11:58:22.406216] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:09.085 [2024-07-15 11:58:22.471989] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:09.085 [2024-07-15 11:58:22.472028] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:09.651 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:17:09.909 malloc1 00:17:09.909 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:10.168 [2024-07-15 11:58:23.576561] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:10.168 [2024-07-15 11:58:23.576609] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.168 [2024-07-15 11:58:23.576630] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b7e560 00:17:10.168 [2024-07-15 11:58:23.576642] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.168 [2024-07-15 11:58:23.578286] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.168 [2024-07-15 11:58:23.578315] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:10.168 pt1 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:10.168 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:10.168 11:58:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:10.426 malloc2 00:17:10.426 11:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:10.685 [2024-07-15 11:58:24.066529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:10.685 [2024-07-15 11:58:24.066573] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:10.685 [2024-07-15 11:58:24.066596] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1c5b0 00:17:10.685 [2024-07-15 11:58:24.066608] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:10.685 [2024-07-15 11:58:24.068123] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:10.685 [2024-07-15 11:58:24.068152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:10.685 pt2 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:10.685 11:58:24 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:10.685 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:17:10.944 malloc3 00:17:10.944 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:11.513 [2024-07-15 11:58:24.821810] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:11.513 [2024-07-15 11:58:24.821868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:11.513 [2024-07-15 11:58:24.821885] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1cbe0 00:17:11.513 [2024-07-15 11:58:24.821897] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:11.513 [2024-07-15 11:58:24.823471] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:11.513 [2024-07-15 11:58:24.823498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:11.513 pt3 00:17:11.513 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:11.513 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:11.513 11:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:17:11.513 [2024-07-15 11:58:25.078499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:11.513 [2024-07-15 11:58:25.079841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:17:11.513 [2024-07-15 11:58:25.079898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:11.513 [2024-07-15 11:58:25.080042] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c1d510 00:17:11.513 [2024-07-15 11:58:25.080054] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:11.513 [2024-07-15 11:58:25.080253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b81ef0 00:17:11.513 [2024-07-15 11:58:25.080392] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c1d510 00:17:11.513 [2024-07-15 11:58:25.080402] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c1d510 00:17:11.513 [2024-07-15 11:58:25.080499] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:11.513 
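`verify_raid_bdev_state raid_bdev1 online concat 64 3`, invoked above, fetches `bdev_raid_get_bdevs all`, selects the entry whose `.name` matches, and compares state, level, strip size, and operational base bdev count against its arguments. A sketch of that verification in Python, over a hypothetical trimmed sample of the RPC output:

```python
import json

# Hypothetical output of `rpc.py -s /var/tmp/spdk-raid.sock
# bdev_raid_get_bdevs all` (fields trimmed).
all_raid_bdevs = json.loads("""
[
  {"name": "raid_bdev1", "state": "online", "raid_level": "concat",
   "strip_size_kb": 64, "num_base_bdevs_operational": 3}
]
""")

# Equivalent of: jq -r '.[] | select(.name == "raid_bdev1")'
tmp = next(b for b in all_raid_bdevs if b["name"] == "raid_bdev1")

# The comparisons verify_raid_bdev_state performs against its arguments
# (raid_bdev1 online concat 64 3):
state_ok = (tmp["state"] == "online"
            and tmp["raid_level"] == "concat"
            and tmp["strip_size_kb"] == 64
            and tmp["num_base_bdevs_operational"] == 3)
print(state_ok)
```

Only after this state check passes does the test move on to `verify_raid_bdev_properties raid_bdev1`, the per-property jq probing visible in the lines that follow.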
11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:11.513 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:11.772 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:11.772 "name": "raid_bdev1", 00:17:11.772 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:11.772 "strip_size_kb": 64, 00:17:11.772 "state": "online", 00:17:11.772 "raid_level": "concat", 00:17:11.772 "superblock": true, 00:17:11.772 "num_base_bdevs": 3, 00:17:11.772 "num_base_bdevs_discovered": 3, 00:17:11.772 "num_base_bdevs_operational": 3, 00:17:11.772 "base_bdevs_list": [ 00:17:11.772 { 00:17:11.772 "name": "pt1", 00:17:11.772 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:11.772 "is_configured": true, 00:17:11.772 "data_offset": 2048, 00:17:11.772 "data_size": 63488 00:17:11.772 }, 00:17:11.772 { 00:17:11.772 "name": "pt2", 00:17:11.772 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:11.772 "is_configured": true, 00:17:11.773 "data_offset": 2048, 00:17:11.773 "data_size": 63488 00:17:11.773 }, 00:17:11.773 { 00:17:11.773 "name": "pt3", 00:17:11.773 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:11.773 "is_configured": true, 00:17:11.773 "data_offset": 2048, 00:17:11.773 "data_size": 63488 00:17:11.773 } 00:17:11.773 ] 00:17:11.773 }' 00:17:11.773 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:11.773 11:58:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:12.711 11:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:12.711 [2024-07-15 11:58:26.181676] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:12.711 "name": "raid_bdev1", 00:17:12.711 "aliases": [ 00:17:12.711 "f2752fc0-a840-4b4e-b8b5-9a59467898fd" 00:17:12.711 ], 00:17:12.711 "product_name": "Raid Volume", 00:17:12.711 "block_size": 512, 00:17:12.711 "num_blocks": 190464, 00:17:12.711 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:12.711 "assigned_rate_limits": { 00:17:12.711 "rw_ios_per_sec": 0, 00:17:12.711 "rw_mbytes_per_sec": 0, 00:17:12.711 "r_mbytes_per_sec": 0, 00:17:12.711 "w_mbytes_per_sec": 0 00:17:12.711 }, 00:17:12.711 "claimed": false, 00:17:12.711 "zoned": false, 00:17:12.711 "supported_io_types": { 00:17:12.711 "read": true, 00:17:12.711 "write": true, 00:17:12.711 "unmap": true, 00:17:12.711 "flush": true, 00:17:12.711 "reset": true, 00:17:12.711 "nvme_admin": false, 00:17:12.711 "nvme_io": false, 00:17:12.711 "nvme_io_md": false, 00:17:12.711 "write_zeroes": true, 00:17:12.711 "zcopy": false, 00:17:12.711 "get_zone_info": false, 00:17:12.711 "zone_management": false, 00:17:12.711 "zone_append": false, 00:17:12.711 "compare": false, 00:17:12.711 "compare_and_write": false, 00:17:12.711 "abort": false, 00:17:12.711 
"seek_hole": false, 00:17:12.711 "seek_data": false, 00:17:12.711 "copy": false, 00:17:12.711 "nvme_iov_md": false 00:17:12.711 }, 00:17:12.711 "memory_domains": [ 00:17:12.711 { 00:17:12.711 "dma_device_id": "system", 00:17:12.711 "dma_device_type": 1 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.711 "dma_device_type": 2 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "dma_device_id": "system", 00:17:12.711 "dma_device_type": 1 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.711 "dma_device_type": 2 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "dma_device_id": "system", 00:17:12.711 "dma_device_type": 1 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.711 "dma_device_type": 2 00:17:12.711 } 00:17:12.711 ], 00:17:12.711 "driver_specific": { 00:17:12.711 "raid": { 00:17:12.711 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:12.711 "strip_size_kb": 64, 00:17:12.711 "state": "online", 00:17:12.711 "raid_level": "concat", 00:17:12.711 "superblock": true, 00:17:12.711 "num_base_bdevs": 3, 00:17:12.711 "num_base_bdevs_discovered": 3, 00:17:12.711 "num_base_bdevs_operational": 3, 00:17:12.711 "base_bdevs_list": [ 00:17:12.711 { 00:17:12.711 "name": "pt1", 00:17:12.711 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:12.711 "is_configured": true, 00:17:12.711 "data_offset": 2048, 00:17:12.711 "data_size": 63488 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "name": "pt2", 00:17:12.711 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:12.711 "is_configured": true, 00:17:12.711 "data_offset": 2048, 00:17:12.711 "data_size": 63488 00:17:12.711 }, 00:17:12.711 { 00:17:12.711 "name": "pt3", 00:17:12.711 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:12.711 "is_configured": true, 00:17:12.711 "data_offset": 2048, 00:17:12.711 "data_size": 63488 00:17:12.711 } 00:17:12.711 ] 00:17:12.711 } 00:17:12.711 } 00:17:12.711 }' 
00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:12.711 pt2 00:17:12.711 pt3' 00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:12.711 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:12.971 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:12.971 "name": "pt1", 00:17:12.971 "aliases": [ 00:17:12.971 "00000000-0000-0000-0000-000000000001" 00:17:12.971 ], 00:17:12.971 "product_name": "passthru", 00:17:12.971 "block_size": 512, 00:17:12.971 "num_blocks": 65536, 00:17:12.971 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:12.971 "assigned_rate_limits": { 00:17:12.971 "rw_ios_per_sec": 0, 00:17:12.971 "rw_mbytes_per_sec": 0, 00:17:12.971 "r_mbytes_per_sec": 0, 00:17:12.971 "w_mbytes_per_sec": 0 00:17:12.971 }, 00:17:12.971 "claimed": true, 00:17:12.971 "claim_type": "exclusive_write", 00:17:12.971 "zoned": false, 00:17:12.971 "supported_io_types": { 00:17:12.971 "read": true, 00:17:12.971 "write": true, 00:17:12.971 "unmap": true, 00:17:12.971 "flush": true, 00:17:12.971 "reset": true, 00:17:12.971 "nvme_admin": false, 00:17:12.971 "nvme_io": false, 00:17:12.971 "nvme_io_md": false, 00:17:12.971 "write_zeroes": true, 00:17:12.971 "zcopy": true, 00:17:12.971 "get_zone_info": false, 00:17:12.971 "zone_management": false, 00:17:12.971 "zone_append": false, 00:17:12.971 "compare": false, 00:17:12.971 "compare_and_write": false, 00:17:12.971 "abort": true, 00:17:12.971 "seek_hole": false, 00:17:12.971 
"seek_data": false, 00:17:12.971 "copy": true, 00:17:12.971 "nvme_iov_md": false 00:17:12.971 }, 00:17:12.971 "memory_domains": [ 00:17:12.971 { 00:17:12.971 "dma_device_id": "system", 00:17:12.971 "dma_device_type": 1 00:17:12.971 }, 00:17:12.971 { 00:17:12.971 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:12.971 "dma_device_type": 2 00:17:12.971 } 00:17:12.971 ], 00:17:12.971 "driver_specific": { 00:17:12.971 "passthru": { 00:17:12.971 "name": "pt1", 00:17:12.971 "base_bdev_name": "malloc1" 00:17:12.971 } 00:17:12.971 } 00:17:12.971 }' 00:17:12.971 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.971 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:12.971 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:12.971 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:13.230 11:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:13.490 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:13.490 "name": "pt2", 00:17:13.490 "aliases": [ 00:17:13.490 "00000000-0000-0000-0000-000000000002" 00:17:13.490 ], 00:17:13.490 "product_name": "passthru", 00:17:13.490 "block_size": 512, 00:17:13.490 "num_blocks": 65536, 00:17:13.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:13.490 "assigned_rate_limits": { 00:17:13.490 "rw_ios_per_sec": 0, 00:17:13.490 "rw_mbytes_per_sec": 0, 00:17:13.490 "r_mbytes_per_sec": 0, 00:17:13.490 "w_mbytes_per_sec": 0 00:17:13.490 }, 00:17:13.490 "claimed": true, 00:17:13.490 "claim_type": "exclusive_write", 00:17:13.490 "zoned": false, 00:17:13.490 "supported_io_types": { 00:17:13.490 "read": true, 00:17:13.490 "write": true, 00:17:13.490 "unmap": true, 00:17:13.490 "flush": true, 00:17:13.490 "reset": true, 00:17:13.490 "nvme_admin": false, 00:17:13.490 "nvme_io": false, 00:17:13.490 "nvme_io_md": false, 00:17:13.490 "write_zeroes": true, 00:17:13.490 "zcopy": true, 00:17:13.490 "get_zone_info": false, 00:17:13.490 "zone_management": false, 00:17:13.490 "zone_append": false, 00:17:13.490 "compare": false, 00:17:13.490 "compare_and_write": false, 00:17:13.490 "abort": true, 00:17:13.490 "seek_hole": false, 00:17:13.490 "seek_data": false, 00:17:13.490 "copy": true, 00:17:13.490 "nvme_iov_md": false 00:17:13.490 }, 00:17:13.490 "memory_domains": [ 00:17:13.490 { 00:17:13.490 "dma_device_id": "system", 00:17:13.490 "dma_device_type": 1 00:17:13.490 }, 00:17:13.490 { 00:17:13.490 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:13.490 "dma_device_type": 2 00:17:13.490 } 00:17:13.490 ], 00:17:13.490 "driver_specific": { 00:17:13.490 "passthru": { 00:17:13.490 "name": "pt2", 00:17:13.490 "base_bdev_name": "malloc2" 00:17:13.490 } 00:17:13.490 } 00:17:13.490 }' 00:17:13.490 11:58:27 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.490 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:13.749 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.008 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.008 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.008 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:14.008 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:14.008 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:14.268 "name": "pt3", 00:17:14.268 "aliases": [ 00:17:14.268 "00000000-0000-0000-0000-000000000003" 00:17:14.268 ], 00:17:14.268 "product_name": "passthru", 00:17:14.268 "block_size": 512, 00:17:14.268 "num_blocks": 65536, 00:17:14.268 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:14.268 "assigned_rate_limits": { 
00:17:14.268 "rw_ios_per_sec": 0, 00:17:14.268 "rw_mbytes_per_sec": 0, 00:17:14.268 "r_mbytes_per_sec": 0, 00:17:14.268 "w_mbytes_per_sec": 0 00:17:14.268 }, 00:17:14.268 "claimed": true, 00:17:14.268 "claim_type": "exclusive_write", 00:17:14.268 "zoned": false, 00:17:14.268 "supported_io_types": { 00:17:14.268 "read": true, 00:17:14.268 "write": true, 00:17:14.268 "unmap": true, 00:17:14.268 "flush": true, 00:17:14.268 "reset": true, 00:17:14.268 "nvme_admin": false, 00:17:14.268 "nvme_io": false, 00:17:14.268 "nvme_io_md": false, 00:17:14.268 "write_zeroes": true, 00:17:14.268 "zcopy": true, 00:17:14.268 "get_zone_info": false, 00:17:14.268 "zone_management": false, 00:17:14.268 "zone_append": false, 00:17:14.268 "compare": false, 00:17:14.268 "compare_and_write": false, 00:17:14.268 "abort": true, 00:17:14.268 "seek_hole": false, 00:17:14.268 "seek_data": false, 00:17:14.268 "copy": true, 00:17:14.268 "nvme_iov_md": false 00:17:14.268 }, 00:17:14.268 "memory_domains": [ 00:17:14.268 { 00:17:14.268 "dma_device_id": "system", 00:17:14.268 "dma_device_type": 1 00:17:14.268 }, 00:17:14.268 { 00:17:14.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:14.268 "dma_device_type": 2 00:17:14.268 } 00:17:14.268 ], 00:17:14.268 "driver_specific": { 00:17:14.268 "passthru": { 00:17:14.268 "name": "pt3", 00:17:14.268 "base_bdev_name": "malloc3" 00:17:14.268 } 00:17:14.268 } 00:17:14.268 }' 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.268 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:14.528 11:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:17:14.787 [2024-07-15 11:58:28.199026] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:14.787 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=f2752fc0-a840-4b4e-b8b5-9a59467898fd 00:17:14.787 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z f2752fc0-a840-4b4e-b8b5-9a59467898fd ']' 00:17:14.787 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:15.047 [2024-07-15 11:58:28.491514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:15.047 [2024-07-15 11:58:28.491537] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:15.047 [2024-07-15 11:58:28.491584] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:15.047 [2024-07-15 11:58:28.491636] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:17:15.047 [2024-07-15 11:58:28.491648] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c1d510 name raid_bdev1, state offline 00:17:15.047 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.047 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:17:15.307 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:17:15.307 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:17:15.307 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:15.307 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:15.567 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:15.567 11:58:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:15.826 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:17:15.826 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:16.085 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:17:16.085 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:17:16.345 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:17:16.345 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:16.345 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:17:16.345 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:17:16.346 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:17:16.605 [2024-07-15 11:58:29.951304] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:17:16.605 [2024-07-15 11:58:29.952642] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:17:16.605 [2024-07-15 11:58:29.952696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:17:16.606 [2024-07-15 11:58:29.952742] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:17:16.606 [2024-07-15 11:58:29.952780] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:17:16.606 [2024-07-15 11:58:29.952804] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:17:16.606 [2024-07-15 11:58:29.952822] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:16.606 [2024-07-15 11:58:29.952831] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c20500 name raid_bdev1, state configuring 00:17:16.606 request: 00:17:16.606 { 00:17:16.606 "name": "raid_bdev1", 00:17:16.606 "raid_level": "concat", 00:17:16.606 "base_bdevs": [ 00:17:16.606 "malloc1", 00:17:16.606 "malloc2", 00:17:16.606 "malloc3" 00:17:16.606 ], 00:17:16.606 "strip_size_kb": 64, 00:17:16.606 "superblock": false, 00:17:16.606 "method": "bdev_raid_create", 00:17:16.606 "req_id": 1 00:17:16.606 } 00:17:16.606 Got JSON-RPC error response 00:17:16.606 response: 00:17:16.606 { 00:17:16.606 "code": -17, 00:17:16.606 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:17:16.606 } 00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.606 11:58:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:16.865 [2024-07-15 11:58:30.440734] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:16.865 [2024-07-15 11:58:30.440784] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:16.865 [2024-07-15 11:58:30.440803] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c1e210 00:17:16.865 [2024-07-15 11:58:30.440815] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:16.865 [2024-07-15 11:58:30.442430] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:16.865 [2024-07-15 11:58:30.442458] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:16.865 [2024-07-15 11:58:30.442526] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:16.865 [2024-07-15 11:58:30.442551] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:16.865 pt1 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:16.865 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:17.125 "name": "raid_bdev1", 00:17:17.125 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:17.125 "strip_size_kb": 64, 00:17:17.125 "state": "configuring", 00:17:17.125 "raid_level": "concat", 00:17:17.125 "superblock": true, 00:17:17.125 "num_base_bdevs": 3, 00:17:17.125 "num_base_bdevs_discovered": 1, 00:17:17.125 "num_base_bdevs_operational": 3, 00:17:17.125 "base_bdevs_list": [ 00:17:17.125 { 00:17:17.125 "name": "pt1", 00:17:17.125 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:17.125 
"is_configured": true, 00:17:17.125 "data_offset": 2048, 00:17:17.125 "data_size": 63488 00:17:17.125 }, 00:17:17.125 { 00:17:17.125 "name": null, 00:17:17.125 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:17.125 "is_configured": false, 00:17:17.125 "data_offset": 2048, 00:17:17.125 "data_size": 63488 00:17:17.125 }, 00:17:17.125 { 00:17:17.125 "name": null, 00:17:17.125 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:17.125 "is_configured": false, 00:17:17.125 "data_offset": 2048, 00:17:17.125 "data_size": 63488 00:17:17.125 } 00:17:17.125 ] 00:17:17.125 }' 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:17.125 11:58:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:18.064 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:17:18.064 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:18.064 [2024-07-15 11:58:31.535640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:18.064 [2024-07-15 11:58:31.535699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.064 [2024-07-15 11:58:31.535720] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c20500 00:17:18.064 [2024-07-15 11:58:31.535733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.064 [2024-07-15 11:58:31.536068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.064 [2024-07-15 11:58:31.536085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:18.064 [2024-07-15 11:58:31.536150] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:18.064 [2024-07-15 
11:58:31.536169] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:18.064 pt2 00:17:18.064 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:18.323 [2024-07-15 11:58:31.776291] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.323 11:58:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:18.582 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.582 "name": "raid_bdev1", 00:17:18.582 
"uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:18.582 "strip_size_kb": 64, 00:17:18.582 "state": "configuring", 00:17:18.582 "raid_level": "concat", 00:17:18.582 "superblock": true, 00:17:18.582 "num_base_bdevs": 3, 00:17:18.582 "num_base_bdevs_discovered": 1, 00:17:18.582 "num_base_bdevs_operational": 3, 00:17:18.582 "base_bdevs_list": [ 00:17:18.582 { 00:17:18.582 "name": "pt1", 00:17:18.582 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:18.582 "is_configured": true, 00:17:18.582 "data_offset": 2048, 00:17:18.582 "data_size": 63488 00:17:18.582 }, 00:17:18.582 { 00:17:18.582 "name": null, 00:17:18.582 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:18.582 "is_configured": false, 00:17:18.582 "data_offset": 2048, 00:17:18.582 "data_size": 63488 00:17:18.582 }, 00:17:18.582 { 00:17:18.582 "name": null, 00:17:18.582 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:18.582 "is_configured": false, 00:17:18.582 "data_offset": 2048, 00:17:18.582 "data_size": 63488 00:17:18.582 } 00:17:18.582 ] 00:17:18.582 }' 00:17:18.582 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.582 11:58:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.150 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:19.150 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:19.150 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:19.409 [2024-07-15 11:58:32.859149] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:19.409 [2024-07-15 11:58:32.859208] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.409 [2024-07-15 11:58:32.859227] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b809c0 00:17:19.409 [2024-07-15 11:58:32.859239] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.409 [2024-07-15 11:58:32.859582] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.409 [2024-07-15 11:58:32.859599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:19.409 [2024-07-15 11:58:32.859663] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:19.409 [2024-07-15 11:58:32.859682] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:19.409 pt2 00:17:19.409 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:19.409 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:19.409 11:58:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:19.668 [2024-07-15 11:58:33.107824] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:19.668 [2024-07-15 11:58:33.107870] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:19.668 [2024-07-15 11:58:33.107891] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b810b0 00:17:19.668 [2024-07-15 11:58:33.107903] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:19.668 [2024-07-15 11:58:33.108226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:19.668 [2024-07-15 11:58:33.108244] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:19.668 [2024-07-15 11:58:33.108303] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:19.668 
[2024-07-15 11:58:33.108322] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:19.668 [2024-07-15 11:58:33.108428] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c1f030 00:17:19.668 [2024-07-15 11:58:33.108438] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:19.668 [2024-07-15 11:58:33.108604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c1fb70 00:17:19.668 [2024-07-15 11:58:33.108737] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c1f030 00:17:19.668 [2024-07-15 11:58:33.108748] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c1f030 00:17:19.668 [2024-07-15 11:58:33.108848] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:19.668 pt3 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.668 
11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.668 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:19.926 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.926 "name": "raid_bdev1", 00:17:19.926 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:19.927 "strip_size_kb": 64, 00:17:19.927 "state": "online", 00:17:19.927 "raid_level": "concat", 00:17:19.927 "superblock": true, 00:17:19.927 "num_base_bdevs": 3, 00:17:19.927 "num_base_bdevs_discovered": 3, 00:17:19.927 "num_base_bdevs_operational": 3, 00:17:19.927 "base_bdevs_list": [ 00:17:19.927 { 00:17:19.927 "name": "pt1", 00:17:19.927 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:19.927 "is_configured": true, 00:17:19.927 "data_offset": 2048, 00:17:19.927 "data_size": 63488 00:17:19.927 }, 00:17:19.927 { 00:17:19.927 "name": "pt2", 00:17:19.927 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:19.927 "is_configured": true, 00:17:19.927 "data_offset": 2048, 00:17:19.927 "data_size": 63488 00:17:19.927 }, 00:17:19.927 { 00:17:19.927 "name": "pt3", 00:17:19.927 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:19.927 "is_configured": true, 00:17:19.927 "data_offset": 2048, 00:17:19.927 "data_size": 63488 00:17:19.927 } 00:17:19.927 ] 00:17:19.927 }' 00:17:19.927 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.927 11:58:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # 
verify_raid_bdev_properties raid_bdev1 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:20.492 11:58:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:20.751 [2024-07-15 11:58:34.186943] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:20.751 "name": "raid_bdev1", 00:17:20.751 "aliases": [ 00:17:20.751 "f2752fc0-a840-4b4e-b8b5-9a59467898fd" 00:17:20.751 ], 00:17:20.751 "product_name": "Raid Volume", 00:17:20.751 "block_size": 512, 00:17:20.751 "num_blocks": 190464, 00:17:20.751 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:20.751 "assigned_rate_limits": { 00:17:20.751 "rw_ios_per_sec": 0, 00:17:20.751 "rw_mbytes_per_sec": 0, 00:17:20.751 "r_mbytes_per_sec": 0, 00:17:20.751 "w_mbytes_per_sec": 0 00:17:20.751 }, 00:17:20.751 "claimed": false, 00:17:20.751 "zoned": false, 00:17:20.751 "supported_io_types": { 00:17:20.751 "read": true, 00:17:20.751 "write": true, 00:17:20.751 "unmap": true, 00:17:20.751 "flush": true, 00:17:20.751 "reset": true, 00:17:20.751 "nvme_admin": false, 00:17:20.751 "nvme_io": false, 00:17:20.751 "nvme_io_md": false, 00:17:20.751 "write_zeroes": true, 00:17:20.751 "zcopy": false, 00:17:20.751 
"get_zone_info": false, 00:17:20.751 "zone_management": false, 00:17:20.751 "zone_append": false, 00:17:20.751 "compare": false, 00:17:20.751 "compare_and_write": false, 00:17:20.751 "abort": false, 00:17:20.751 "seek_hole": false, 00:17:20.751 "seek_data": false, 00:17:20.751 "copy": false, 00:17:20.751 "nvme_iov_md": false 00:17:20.751 }, 00:17:20.751 "memory_domains": [ 00:17:20.751 { 00:17:20.751 "dma_device_id": "system", 00:17:20.751 "dma_device_type": 1 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.751 "dma_device_type": 2 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "dma_device_id": "system", 00:17:20.751 "dma_device_type": 1 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.751 "dma_device_type": 2 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "dma_device_id": "system", 00:17:20.751 "dma_device_type": 1 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:20.751 "dma_device_type": 2 00:17:20.751 } 00:17:20.751 ], 00:17:20.751 "driver_specific": { 00:17:20.751 "raid": { 00:17:20.751 "uuid": "f2752fc0-a840-4b4e-b8b5-9a59467898fd", 00:17:20.751 "strip_size_kb": 64, 00:17:20.751 "state": "online", 00:17:20.751 "raid_level": "concat", 00:17:20.751 "superblock": true, 00:17:20.751 "num_base_bdevs": 3, 00:17:20.751 "num_base_bdevs_discovered": 3, 00:17:20.751 "num_base_bdevs_operational": 3, 00:17:20.751 "base_bdevs_list": [ 00:17:20.751 { 00:17:20.751 "name": "pt1", 00:17:20.751 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:20.751 "is_configured": true, 00:17:20.751 "data_offset": 2048, 00:17:20.751 "data_size": 63488 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "name": "pt2", 00:17:20.751 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:20.751 "is_configured": true, 00:17:20.751 "data_offset": 2048, 00:17:20.751 "data_size": 63488 00:17:20.751 }, 00:17:20.751 { 00:17:20.751 "name": "pt3", 00:17:20.751 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:17:20.751 "is_configured": true, 00:17:20.751 "data_offset": 2048, 00:17:20.751 "data_size": 63488 00:17:20.751 } 00:17:20.751 ] 00:17:20.751 } 00:17:20.751 } 00:17:20.751 }' 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:20.751 pt2 00:17:20.751 pt3' 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:20.751 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.009 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.009 "name": "pt1", 00:17:21.009 "aliases": [ 00:17:21.009 "00000000-0000-0000-0000-000000000001" 00:17:21.009 ], 00:17:21.009 "product_name": "passthru", 00:17:21.009 "block_size": 512, 00:17:21.009 "num_blocks": 65536, 00:17:21.009 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:21.009 "assigned_rate_limits": { 00:17:21.009 "rw_ios_per_sec": 0, 00:17:21.009 "rw_mbytes_per_sec": 0, 00:17:21.009 "r_mbytes_per_sec": 0, 00:17:21.009 "w_mbytes_per_sec": 0 00:17:21.009 }, 00:17:21.009 "claimed": true, 00:17:21.009 "claim_type": "exclusive_write", 00:17:21.009 "zoned": false, 00:17:21.009 "supported_io_types": { 00:17:21.009 "read": true, 00:17:21.009 "write": true, 00:17:21.009 "unmap": true, 00:17:21.009 "flush": true, 00:17:21.009 "reset": true, 00:17:21.009 "nvme_admin": false, 00:17:21.009 "nvme_io": false, 00:17:21.009 "nvme_io_md": false, 00:17:21.009 "write_zeroes": true, 00:17:21.009 "zcopy": true, 00:17:21.009 "get_zone_info": false, 
00:17:21.009 "zone_management": false, 00:17:21.009 "zone_append": false, 00:17:21.009 "compare": false, 00:17:21.009 "compare_and_write": false, 00:17:21.009 "abort": true, 00:17:21.009 "seek_hole": false, 00:17:21.009 "seek_data": false, 00:17:21.009 "copy": true, 00:17:21.009 "nvme_iov_md": false 00:17:21.009 }, 00:17:21.009 "memory_domains": [ 00:17:21.009 { 00:17:21.009 "dma_device_id": "system", 00:17:21.009 "dma_device_type": 1 00:17:21.009 }, 00:17:21.009 { 00:17:21.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.009 "dma_device_type": 2 00:17:21.009 } 00:17:21.009 ], 00:17:21.009 "driver_specific": { 00:17:21.009 "passthru": { 00:17:21.009 "name": "pt1", 00:17:21.009 "base_bdev_name": "malloc1" 00:17:21.009 } 00:17:21.009 } 00:17:21.009 }' 00:17:21.009 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.009 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.009 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.009 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:21.268 11:58:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:21.268 11:58:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:21.527 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:21.527 "name": "pt2", 00:17:21.527 "aliases": [ 00:17:21.527 "00000000-0000-0000-0000-000000000002" 00:17:21.527 ], 00:17:21.527 "product_name": "passthru", 00:17:21.527 "block_size": 512, 00:17:21.527 "num_blocks": 65536, 00:17:21.527 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:21.527 "assigned_rate_limits": { 00:17:21.527 "rw_ios_per_sec": 0, 00:17:21.527 "rw_mbytes_per_sec": 0, 00:17:21.527 "r_mbytes_per_sec": 0, 00:17:21.527 "w_mbytes_per_sec": 0 00:17:21.527 }, 00:17:21.527 "claimed": true, 00:17:21.527 "claim_type": "exclusive_write", 00:17:21.527 "zoned": false, 00:17:21.527 "supported_io_types": { 00:17:21.527 "read": true, 00:17:21.527 "write": true, 00:17:21.527 "unmap": true, 00:17:21.527 "flush": true, 00:17:21.527 "reset": true, 00:17:21.527 "nvme_admin": false, 00:17:21.527 "nvme_io": false, 00:17:21.527 "nvme_io_md": false, 00:17:21.527 "write_zeroes": true, 00:17:21.527 "zcopy": true, 00:17:21.527 "get_zone_info": false, 00:17:21.527 "zone_management": false, 00:17:21.527 "zone_append": false, 00:17:21.527 "compare": false, 00:17:21.527 "compare_and_write": false, 00:17:21.527 "abort": true, 00:17:21.527 "seek_hole": false, 00:17:21.527 "seek_data": false, 00:17:21.527 "copy": true, 00:17:21.527 "nvme_iov_md": false 00:17:21.527 }, 00:17:21.527 "memory_domains": [ 00:17:21.527 { 00:17:21.527 "dma_device_id": "system", 00:17:21.527 "dma_device_type": 1 00:17:21.527 }, 00:17:21.527 { 00:17:21.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:21.527 
"dma_device_type": 2 00:17:21.527 } 00:17:21.527 ], 00:17:21.527 "driver_specific": { 00:17:21.527 "passthru": { 00:17:21.527 "name": "pt2", 00:17:21.527 "base_bdev_name": "malloc2" 00:17:21.527 } 00:17:21.527 } 00:17:21.527 }' 00:17:21.527 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:21.785 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.044 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.044 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.044 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:22.044 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:22.044 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:22.303 "name": "pt3", 00:17:22.303 "aliases": [ 00:17:22.303 
"00000000-0000-0000-0000-000000000003" 00:17:22.303 ], 00:17:22.303 "product_name": "passthru", 00:17:22.303 "block_size": 512, 00:17:22.303 "num_blocks": 65536, 00:17:22.303 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:22.303 "assigned_rate_limits": { 00:17:22.303 "rw_ios_per_sec": 0, 00:17:22.303 "rw_mbytes_per_sec": 0, 00:17:22.303 "r_mbytes_per_sec": 0, 00:17:22.303 "w_mbytes_per_sec": 0 00:17:22.303 }, 00:17:22.303 "claimed": true, 00:17:22.303 "claim_type": "exclusive_write", 00:17:22.303 "zoned": false, 00:17:22.303 "supported_io_types": { 00:17:22.303 "read": true, 00:17:22.303 "write": true, 00:17:22.303 "unmap": true, 00:17:22.303 "flush": true, 00:17:22.303 "reset": true, 00:17:22.303 "nvme_admin": false, 00:17:22.303 "nvme_io": false, 00:17:22.303 "nvme_io_md": false, 00:17:22.303 "write_zeroes": true, 00:17:22.303 "zcopy": true, 00:17:22.303 "get_zone_info": false, 00:17:22.303 "zone_management": false, 00:17:22.303 "zone_append": false, 00:17:22.303 "compare": false, 00:17:22.303 "compare_and_write": false, 00:17:22.303 "abort": true, 00:17:22.303 "seek_hole": false, 00:17:22.303 "seek_data": false, 00:17:22.303 "copy": true, 00:17:22.303 "nvme_iov_md": false 00:17:22.303 }, 00:17:22.303 "memory_domains": [ 00:17:22.303 { 00:17:22.303 "dma_device_id": "system", 00:17:22.303 "dma_device_type": 1 00:17:22.303 }, 00:17:22.303 { 00:17:22.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.303 "dma_device_type": 2 00:17:22.303 } 00:17:22.303 ], 00:17:22.303 "driver_specific": { 00:17:22.303 "passthru": { 00:17:22.303 "name": "pt3", 00:17:22.303 "base_bdev_name": "malloc3" 00:17:22.303 } 00:17:22.303 } 00:17:22.303 }' 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:22.303 11:58:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.303 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:22.561 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:22.561 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.561 11:58:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:22.561 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:22.561 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:22.561 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:22.820 [2024-07-15 11:58:36.168186] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' f2752fc0-a840-4b4e-b8b5-9a59467898fd '!=' f2752fc0-a840-4b4e-b8b5-9a59467898fd ']' 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1498788 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1498788 ']' 00:17:22.820 11:58:36 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1498788 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1498788 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1498788' 00:17:22.820 killing process with pid 1498788 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1498788 00:17:22.820 [2024-07-15 11:58:36.237809] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.820 [2024-07-15 11:58:36.237866] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.820 [2024-07-15 11:58:36.237920] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.820 [2024-07-15 11:58:36.237931] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c1f030 name raid_bdev1, state offline 00:17:22.820 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1498788 00:17:22.820 [2024-07-15 11:58:36.265381] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:23.080 11:58:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:23.080 00:17:23.080 real 0m14.376s 00:17:23.080 user 0m25.861s 00:17:23.080 sys 0m2.625s 00:17:23.080 11:58:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:23.080 11:58:36 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.080 ************************************ 00:17:23.080 END TEST raid_superblock_test 00:17:23.080 ************************************ 00:17:23.080 11:58:36 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:23.080 11:58:36 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:17:23.080 11:58:36 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:23.080 11:58:36 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:23.080 11:58:36 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:23.080 ************************************ 00:17:23.080 START TEST raid_read_error_test 00:17:23.080 ************************************ 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 read 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.080 11:58:36 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.dWvjLfCvsr 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1500910 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1500910 /var/tmp/spdk-raid.sock 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1500910 ']' 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:23.080 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:23.080 11:58:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:23.080 [2024-07-15 11:58:36.658664] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:17:23.080 [2024-07-15 11:58:36.658739] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1500910 ] 00:17:23.339 [2024-07-15 11:58:36.787671] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:23.339 [2024-07-15 11:58:36.893346] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:23.598 [2024-07-15 11:58:36.964576] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.598 [2024-07-15 11:58:36.964613] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:24.165 11:58:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:24.165 11:58:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:24.165 11:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:24.165 11:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:24.424 BaseBdev1_malloc 00:17:24.424 11:58:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:24.683 true 00:17:24.683 11:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:24.943 [2024-07-15 11:58:38.291056] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:24.943 [2024-07-15 11:58:38.291101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:24.943 [2024-07-15 11:58:38.291121] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14ac4e0 00:17:24.943 [2024-07-15 11:58:38.291133] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:24.943 [2024-07-15 11:58:38.292916] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:24.943 [2024-07-15 11:58:38.292945] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:24.943 BaseBdev1 00:17:24.943 11:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:24.943 11:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:25.202 BaseBdev2_malloc 00:17:25.202 11:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:25.202 true 00:17:25.461 11:58:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:25.461 [2024-07-15 11:58:39.026798] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:25.461 [2024-07-15 11:58:39.026841] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:25.461 [2024-07-15 11:58:39.026861] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b17b0 00:17:25.461 [2024-07-15 11:58:39.026873] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:25.461 [2024-07-15 11:58:39.028465] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:25.461 [2024-07-15 11:58:39.028494] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:25.461 BaseBdev2 00:17:25.461 11:58:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:25.461 11:58:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:25.720 BaseBdev3_malloc 00:17:25.720 11:58:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:25.978 true 00:17:25.978 11:58:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:26.236 [2024-07-15 11:58:39.754539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:26.236 [2024-07-15 11:58:39.754584] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:26.236 [2024-07-15 11:58:39.754606] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14b38f0 00:17:26.236 [2024-07-15 11:58:39.754618] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:26.236 [2024-07-15 11:58:39.756234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:26.236 [2024-07-15 11:58:39.756262] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:26.236 BaseBdev3 00:17:26.236 11:58:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:26.493 [2024-07-15 11:58:40.007235] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:26.493 [2024-07-15 11:58:40.008480] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:26.493 [2024-07-15 11:58:40.008546] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:26.493 [2024-07-15 11:58:40.008763] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14b54b0 00:17:26.493 [2024-07-15 11:58:40.008775] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:26.493 [2024-07-15 11:58:40.008967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14b75e0 00:17:26.493 [2024-07-15 11:58:40.009117] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14b54b0 00:17:26.493 [2024-07-15 11:58:40.009128] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14b54b0 00:17:26.493 [2024-07-15 11:58:40.009231] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.493 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:26.752 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.752 "name": "raid_bdev1", 00:17:26.752 "uuid": "5dc44bfd-25eb-4068-848e-f3dd17711230", 00:17:26.752 "strip_size_kb": 64, 00:17:26.752 "state": "online", 00:17:26.752 "raid_level": "concat", 00:17:26.752 "superblock": true, 00:17:26.752 "num_base_bdevs": 3, 00:17:26.752 "num_base_bdevs_discovered": 3, 00:17:26.752 "num_base_bdevs_operational": 3, 00:17:26.752 "base_bdevs_list": [ 00:17:26.752 { 00:17:26.752 "name": "BaseBdev1", 00:17:26.752 "uuid": "b576287a-f854-5a93-923c-1546b6584bde", 00:17:26.752 "is_configured": true, 00:17:26.752 "data_offset": 2048, 00:17:26.752 "data_size": 63488 00:17:26.752 }, 00:17:26.752 { 00:17:26.752 "name": "BaseBdev2", 00:17:26.752 "uuid": "cf97c9a3-ba10-5963-bfd0-a6b51815db4c", 00:17:26.752 "is_configured": true, 00:17:26.752 "data_offset": 2048, 00:17:26.752 "data_size": 63488 00:17:26.752 }, 00:17:26.752 { 00:17:26.752 "name": "BaseBdev3", 00:17:26.752 "uuid": "bcbd8594-91bc-5495-bf86-64682bb41e31", 00:17:26.752 "is_configured": true, 00:17:26.752 "data_offset": 2048, 00:17:26.752 "data_size": 63488 00:17:26.752 } 00:17:26.752 ] 00:17:26.752 }' 00:17:26.752 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.752 11:58:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.319 11:58:40 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:17:27.319 11:58:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:27.578 [2024-07-15 11:58:40.978067] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14ba110 00:17:28.515 11:58:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.515 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:28.773 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.773 "name": "raid_bdev1", 00:17:28.773 "uuid": "5dc44bfd-25eb-4068-848e-f3dd17711230", 00:17:28.773 "strip_size_kb": 64, 00:17:28.773 "state": "online", 00:17:28.773 "raid_level": "concat", 00:17:28.773 "superblock": true, 00:17:28.773 "num_base_bdevs": 3, 00:17:28.773 "num_base_bdevs_discovered": 3, 00:17:28.773 "num_base_bdevs_operational": 3, 00:17:28.773 "base_bdevs_list": [ 00:17:28.773 { 00:17:28.773 "name": "BaseBdev1", 00:17:28.773 "uuid": "b576287a-f854-5a93-923c-1546b6584bde", 00:17:28.773 "is_configured": true, 00:17:28.773 "data_offset": 2048, 00:17:28.773 "data_size": 63488 00:17:28.773 }, 00:17:28.773 { 00:17:28.773 "name": "BaseBdev2", 00:17:28.773 "uuid": "cf97c9a3-ba10-5963-bfd0-a6b51815db4c", 00:17:28.773 "is_configured": true, 00:17:28.773 "data_offset": 2048, 00:17:28.773 "data_size": 63488 00:17:28.773 }, 00:17:28.773 { 00:17:28.773 "name": "BaseBdev3", 00:17:28.773 "uuid": "bcbd8594-91bc-5495-bf86-64682bb41e31", 00:17:28.773 "is_configured": true, 00:17:28.773 "data_offset": 2048, 00:17:28.773 "data_size": 63488 00:17:28.773 } 00:17:28.773 ] 00:17:28.773 }' 00:17:28.773 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:28.773 11:58:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.339 11:58:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:29.597 [2024-07-15 
11:58:43.118460] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:29.597 [2024-07-15 11:58:43.118499] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:29.597 [2024-07-15 11:58:43.121670] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:29.597 [2024-07-15 11:58:43.121715] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.597 [2024-07-15 11:58:43.121756] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:29.597 [2024-07-15 11:58:43.121767] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14b54b0 name raid_bdev1, state offline 00:17:29.597 0 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1500910 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1500910 ']' 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1500910 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:29.597 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1500910 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1500910' 00:17:29.887 killing process with pid 1500910 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1500910 00:17:29.887 [2024-07-15 11:58:43.199988] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1500910 00:17:29.887 [2024-07-15 11:58:43.221092] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.dWvjLfCvsr 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.47 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.47 != \0\.\0\0 ]] 00:17:29.887 00:17:29.887 real 0m6.883s 00:17:29.887 user 0m10.853s 00:17:29.887 sys 0m1.216s 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:29.887 11:58:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.887 ************************************ 00:17:29.887 END TEST raid_read_error_test 00:17:29.887 ************************************ 00:17:30.168 11:58:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:30.168 11:58:43 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:17:30.168 11:58:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:30.168 11:58:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:30.168 11:58:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:30.168 
************************************ 00:17:30.168 START TEST raid_write_error_test 00:17:30.168 ************************************ 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 3 write 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:30.168 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.rUsOeTkD2T 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1501932 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1501932 /var/tmp/spdk-raid.sock 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1501932 ']' 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:17:30.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:30.169 11:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.169 [2024-07-15 11:58:43.637915] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:17:30.169 [2024-07-15 11:58:43.637991] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1501932 ] 00:17:30.453 [2024-07-15 11:58:43.768917] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:30.453 [2024-07-15 11:58:43.870339] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:30.453 [2024-07-15 11:58:43.934045] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:30.453 [2024-07-15 11:58:43.934085] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:31.020 11:58:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:31.020 11:58:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:17:31.020 11:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:31.020 11:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:31.278 BaseBdev1_malloc 00:17:31.278 11:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:31.278 
true 00:17:31.537 11:58:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:31.537 [2024-07-15 11:58:45.031288] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:31.537 [2024-07-15 11:58:45.031337] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:31.537 [2024-07-15 11:58:45.031356] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11cf4e0 00:17:31.537 [2024-07-15 11:58:45.031369] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:31.537 [2024-07-15 11:58:45.033060] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:31.537 [2024-07-15 11:58:45.033090] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:31.537 BaseBdev1 00:17:31.537 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:31.537 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:31.796 BaseBdev2_malloc 00:17:31.796 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:32.055 true 00:17:32.055 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:32.055 [2024-07-15 11:58:45.577324] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:32.055 [2024-07-15 11:58:45.577369] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:32.055 [2024-07-15 11:58:45.577388] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11d47b0 00:17:32.055 [2024-07-15 11:58:45.577400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:32.055 [2024-07-15 11:58:45.578785] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:32.055 [2024-07-15 11:58:45.578812] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:32.055 BaseBdev2 00:17:32.055 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:32.055 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:32.313 BaseBdev3_malloc 00:17:32.313 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:32.571 true 00:17:32.571 11:58:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:32.571 [2024-07-15 11:58:46.115390] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:32.571 [2024-07-15 11:58:46.115432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:32.571 [2024-07-15 11:58:46.115453] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11d68f0 00:17:32.572 [2024-07-15 11:58:46.115465] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:32.572 [2024-07-15 11:58:46.116926] vbdev_passthru.c: 708:vbdev_passthru_register: 
*NOTICE*: pt_bdev registered 00:17:32.572 [2024-07-15 11:58:46.116954] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:32.572 BaseBdev3 00:17:32.572 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:32.831 [2024-07-15 11:58:46.364089] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:32.831 [2024-07-15 11:58:46.365368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:32.831 [2024-07-15 11:58:46.365437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:32.831 [2024-07-15 11:58:46.365637] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11d84b0 00:17:32.831 [2024-07-15 11:58:46.365649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:17:32.831 [2024-07-15 11:58:46.365854] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11da5e0 00:17:32.831 [2024-07-15 11:58:46.366001] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11d84b0 00:17:32.831 [2024-07-15 11:58:46.366011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x11d84b0 00:17:32.831 [2024-07-15 11:58:46.366112] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:32.831 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:33.090 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.090 "name": "raid_bdev1", 00:17:33.090 "uuid": "2d75982a-6104-4049-b8ad-4654977262fc", 00:17:33.090 "strip_size_kb": 64, 00:17:33.090 "state": "online", 00:17:33.090 "raid_level": "concat", 00:17:33.090 "superblock": true, 00:17:33.090 "num_base_bdevs": 3, 00:17:33.090 "num_base_bdevs_discovered": 3, 00:17:33.090 "num_base_bdevs_operational": 3, 00:17:33.090 "base_bdevs_list": [ 00:17:33.090 { 00:17:33.090 "name": "BaseBdev1", 00:17:33.090 "uuid": "ab611709-fb92-516a-b126-2826bf8d6324", 00:17:33.090 "is_configured": true, 00:17:33.090 "data_offset": 2048, 00:17:33.090 "data_size": 63488 00:17:33.090 }, 00:17:33.090 { 00:17:33.090 "name": "BaseBdev2", 00:17:33.090 "uuid": "627fc94b-e6d3-5909-8f52-a0eab73cb589", 00:17:33.090 "is_configured": true, 00:17:33.090 "data_offset": 2048, 00:17:33.090 "data_size": 63488 00:17:33.090 }, 00:17:33.090 { 00:17:33.090 
"name": "BaseBdev3", 00:17:33.090 "uuid": "eff74c59-4e49-57cf-9061-26b2fe07e13e", 00:17:33.090 "is_configured": true, 00:17:33.090 "data_offset": 2048, 00:17:33.090 "data_size": 63488 00:17:33.090 } 00:17:33.090 ] 00:17:33.090 }' 00:17:33.090 11:58:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.090 11:58:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.026 11:58:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:34.026 11:58:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:34.026 [2024-07-15 11:58:47.399112] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11dd110 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 
00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.963 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:35.222 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:35.222 "name": "raid_bdev1", 00:17:35.222 "uuid": "2d75982a-6104-4049-b8ad-4654977262fc", 00:17:35.222 "strip_size_kb": 64, 00:17:35.222 "state": "online", 00:17:35.222 "raid_level": "concat", 00:17:35.222 "superblock": true, 00:17:35.222 "num_base_bdevs": 3, 00:17:35.222 "num_base_bdevs_discovered": 3, 00:17:35.222 "num_base_bdevs_operational": 3, 00:17:35.222 "base_bdevs_list": [ 00:17:35.222 { 00:17:35.222 "name": "BaseBdev1", 00:17:35.222 "uuid": "ab611709-fb92-516a-b126-2826bf8d6324", 00:17:35.222 "is_configured": true, 00:17:35.222 "data_offset": 2048, 00:17:35.222 "data_size": 63488 00:17:35.222 }, 00:17:35.222 { 00:17:35.222 "name": "BaseBdev2", 00:17:35.222 "uuid": "627fc94b-e6d3-5909-8f52-a0eab73cb589", 00:17:35.222 "is_configured": true, 00:17:35.222 "data_offset": 2048, 00:17:35.222 "data_size": 63488 00:17:35.222 }, 00:17:35.222 { 00:17:35.222 "name": "BaseBdev3", 00:17:35.222 "uuid": "eff74c59-4e49-57cf-9061-26b2fe07e13e", 00:17:35.222 "is_configured": true, 00:17:35.222 "data_offset": 2048, 
00:17:35.222 "data_size": 63488 00:17:35.222 } 00:17:35.222 ] 00:17:35.222 }' 00:17:35.222 11:58:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:35.222 11:58:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:35.790 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:36.050 [2024-07-15 11:58:49.446366] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:36.050 [2024-07-15 11:58:49.446411] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:36.050 [2024-07-15 11:58:49.449593] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:36.050 [2024-07-15 11:58:49.449632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:36.050 [2024-07-15 11:58:49.449667] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:36.050 [2024-07-15 11:58:49.449679] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d84b0 name raid_bdev1, state offline 00:17:36.050 0 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1501932 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1501932 ']' 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1501932 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1501932 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1501932' 00:17:36.050 killing process with pid 1501932 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1501932 00:17:36.050 [2024-07-15 11:58:49.529962] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:36.050 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1501932 00:17:36.050 [2024-07-15 11:58:49.551419] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.rUsOeTkD2T 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:36.311 00:17:36.311 real 0m6.239s 00:17:36.311 user 0m9.653s 00:17:36.311 sys 0m1.162s 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:17:36.311 11:58:49 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.311 ************************************ 00:17:36.311 END TEST raid_write_error_test 
00:17:36.311 ************************************ 00:17:36.311 11:58:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:17:36.311 11:58:49 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:36.311 11:58:49 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:17:36.311 11:58:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:17:36.311 11:58:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:17:36.311 11:58:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:36.311 ************************************ 00:17:36.311 START TEST raid_state_function_test 00:17:36.311 ************************************ 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 3 false 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:36.311 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:36.312 11:58:49 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1502778 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1502778' 00:17:36.312 Process raid pid: 1502778 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1502778 /var/tmp/spdk-raid.sock 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1502778 ']' 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:36.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:17:36.312 11:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:36.572 [2024-07-15 11:58:49.948104] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:17:36.572 [2024-07-15 11:58:49.948175] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:36.572 [2024-07-15 11:58:50.082352] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:36.831 [2024-07-15 11:58:50.193393] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:36.831 [2024-07-15 11:58:50.260466] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:36.831 [2024-07-15 11:58:50.260493] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:37.400 11:58:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:17:37.400 11:58:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:17:37.400 11:58:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:37.659 [2024-07-15 11:58:51.119787] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.659 [2024-07-15 11:58:51.119827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.659 [2024-07-15 11:58:51.119838] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:37.659 [2024-07-15 11:58:51.119850] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:37.659 [2024-07-15 11:58:51.119859] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:37.659 [2024-07-15 11:58:51.119869] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:37.659 11:58:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.659 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.918 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.918 "name": "Existed_Raid", 00:17:37.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.918 "strip_size_kb": 0, 00:17:37.918 "state": "configuring", 00:17:37.918 "raid_level": "raid1", 00:17:37.918 "superblock": false, 00:17:37.918 "num_base_bdevs": 3, 00:17:37.918 "num_base_bdevs_discovered": 0, 00:17:37.918 "num_base_bdevs_operational": 3, 00:17:37.918 "base_bdevs_list": [ 00:17:37.918 { 00:17:37.918 
"name": "BaseBdev1", 00:17:37.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.918 "is_configured": false, 00:17:37.918 "data_offset": 0, 00:17:37.918 "data_size": 0 00:17:37.918 }, 00:17:37.918 { 00:17:37.918 "name": "BaseBdev2", 00:17:37.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.918 "is_configured": false, 00:17:37.918 "data_offset": 0, 00:17:37.918 "data_size": 0 00:17:37.918 }, 00:17:37.918 { 00:17:37.918 "name": "BaseBdev3", 00:17:37.918 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.918 "is_configured": false, 00:17:37.918 "data_offset": 0, 00:17:37.918 "data_size": 0 00:17:37.918 } 00:17:37.918 ] 00:17:37.918 }' 00:17:37.918 11:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.918 11:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.485 11:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:38.744 [2024-07-15 11:58:52.238788] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:38.744 [2024-07-15 11:58:52.238818] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe08b00 name Existed_Raid, state configuring 00:17:38.744 11:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:39.003 [2024-07-15 11:58:52.495474] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:39.003 [2024-07-15 11:58:52.495506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:39.003 [2024-07-15 11:58:52.495515] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:17:39.003 [2024-07-15 11:58:52.495526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:39.003 [2024-07-15 11:58:52.495535] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:39.003 [2024-07-15 11:58:52.495546] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:39.003 11:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:39.261 [2024-07-15 11:58:52.766091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.261 BaseBdev1 00:17:39.261 11:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:39.261 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:39.261 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:39.262 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:39.262 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:39.262 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:39.262 11:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.520 11:58:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:39.787 [ 00:17:39.787 { 00:17:39.787 "name": "BaseBdev1", 00:17:39.787 "aliases": [ 00:17:39.787 "de545d80-626f-4d56-acc1-abeb7310529a" 
00:17:39.787 ], 00:17:39.787 "product_name": "Malloc disk", 00:17:39.787 "block_size": 512, 00:17:39.787 "num_blocks": 65536, 00:17:39.787 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:39.787 "assigned_rate_limits": { 00:17:39.787 "rw_ios_per_sec": 0, 00:17:39.787 "rw_mbytes_per_sec": 0, 00:17:39.787 "r_mbytes_per_sec": 0, 00:17:39.787 "w_mbytes_per_sec": 0 00:17:39.787 }, 00:17:39.788 "claimed": true, 00:17:39.788 "claim_type": "exclusive_write", 00:17:39.788 "zoned": false, 00:17:39.788 "supported_io_types": { 00:17:39.788 "read": true, 00:17:39.788 "write": true, 00:17:39.788 "unmap": true, 00:17:39.788 "flush": true, 00:17:39.788 "reset": true, 00:17:39.788 "nvme_admin": false, 00:17:39.788 "nvme_io": false, 00:17:39.788 "nvme_io_md": false, 00:17:39.788 "write_zeroes": true, 00:17:39.788 "zcopy": true, 00:17:39.788 "get_zone_info": false, 00:17:39.788 "zone_management": false, 00:17:39.788 "zone_append": false, 00:17:39.788 "compare": false, 00:17:39.788 "compare_and_write": false, 00:17:39.788 "abort": true, 00:17:39.788 "seek_hole": false, 00:17:39.788 "seek_data": false, 00:17:39.788 "copy": true, 00:17:39.788 "nvme_iov_md": false 00:17:39.788 }, 00:17:39.788 "memory_domains": [ 00:17:39.788 { 00:17:39.788 "dma_device_id": "system", 00:17:39.788 "dma_device_type": 1 00:17:39.788 }, 00:17:39.788 { 00:17:39.788 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.788 "dma_device_type": 2 00:17:39.788 } 00:17:39.788 ], 00:17:39.788 "driver_specific": {} 00:17:39.788 } 00:17:39.788 ] 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.788 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.046 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.046 "name": "Existed_Raid", 00:17:40.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.046 "strip_size_kb": 0, 00:17:40.046 "state": "configuring", 00:17:40.046 "raid_level": "raid1", 00:17:40.046 "superblock": false, 00:17:40.046 "num_base_bdevs": 3, 00:17:40.046 "num_base_bdevs_discovered": 1, 00:17:40.046 "num_base_bdevs_operational": 3, 00:17:40.046 "base_bdevs_list": [ 00:17:40.046 { 00:17:40.046 "name": "BaseBdev1", 00:17:40.046 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:40.046 "is_configured": true, 00:17:40.046 "data_offset": 0, 00:17:40.046 "data_size": 65536 00:17:40.046 }, 00:17:40.046 { 00:17:40.046 "name": "BaseBdev2", 00:17:40.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.046 "is_configured": 
false, 00:17:40.046 "data_offset": 0, 00:17:40.046 "data_size": 0 00:17:40.046 }, 00:17:40.046 { 00:17:40.046 "name": "BaseBdev3", 00:17:40.046 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.046 "is_configured": false, 00:17:40.046 "data_offset": 0, 00:17:40.046 "data_size": 0 00:17:40.046 } 00:17:40.046 ] 00:17:40.046 }' 00:17:40.046 11:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.046 11:58:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.613 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:40.872 [2024-07-15 11:58:54.334254] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:40.872 [2024-07-15 11:58:54.334294] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe08390 name Existed_Raid, state configuring 00:17:40.872 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:41.131 [2024-07-15 11:58:54.538826] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:41.131 [2024-07-15 11:58:54.540243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:41.131 [2024-07-15 11:58:54.540276] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:41.131 [2024-07-15 11:58:54.540286] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:41.131 [2024-07-15 11:58:54.540298] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.131 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.390 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.390 "name": "Existed_Raid", 00:17:41.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.390 "strip_size_kb": 0, 00:17:41.390 "state": "configuring", 00:17:41.390 "raid_level": "raid1", 00:17:41.390 "superblock": false, 00:17:41.390 "num_base_bdevs": 3, 
00:17:41.390 "num_base_bdevs_discovered": 1, 00:17:41.390 "num_base_bdevs_operational": 3, 00:17:41.390 "base_bdevs_list": [ 00:17:41.390 { 00:17:41.390 "name": "BaseBdev1", 00:17:41.390 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:41.390 "is_configured": true, 00:17:41.390 "data_offset": 0, 00:17:41.390 "data_size": 65536 00:17:41.390 }, 00:17:41.390 { 00:17:41.390 "name": "BaseBdev2", 00:17:41.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.390 "is_configured": false, 00:17:41.390 "data_offset": 0, 00:17:41.390 "data_size": 0 00:17:41.390 }, 00:17:41.390 { 00:17:41.390 "name": "BaseBdev3", 00:17:41.390 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.390 "is_configured": false, 00:17:41.390 "data_offset": 0, 00:17:41.390 "data_size": 0 00:17:41.390 } 00:17:41.390 ] 00:17:41.390 }' 00:17:41.390 11:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.390 11:58:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:41.957 [2024-07-15 11:58:55.452778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:41.957 BaseBdev2 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:41.957 11:58:55 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:41.957 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:42.215 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:42.474 [ 00:17:42.474 { 00:17:42.474 "name": "BaseBdev2", 00:17:42.474 "aliases": [ 00:17:42.474 "050ee212-4a79-4079-9afa-b754af668170" 00:17:42.474 ], 00:17:42.474 "product_name": "Malloc disk", 00:17:42.474 "block_size": 512, 00:17:42.474 "num_blocks": 65536, 00:17:42.474 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:42.474 "assigned_rate_limits": { 00:17:42.474 "rw_ios_per_sec": 0, 00:17:42.474 "rw_mbytes_per_sec": 0, 00:17:42.474 "r_mbytes_per_sec": 0, 00:17:42.474 "w_mbytes_per_sec": 0 00:17:42.474 }, 00:17:42.474 "claimed": true, 00:17:42.474 "claim_type": "exclusive_write", 00:17:42.474 "zoned": false, 00:17:42.474 "supported_io_types": { 00:17:42.474 "read": true, 00:17:42.474 "write": true, 00:17:42.474 "unmap": true, 00:17:42.474 "flush": true, 00:17:42.474 "reset": true, 00:17:42.474 "nvme_admin": false, 00:17:42.474 "nvme_io": false, 00:17:42.474 "nvme_io_md": false, 00:17:42.474 "write_zeroes": true, 00:17:42.474 "zcopy": true, 00:17:42.474 "get_zone_info": false, 00:17:42.474 "zone_management": false, 00:17:42.474 "zone_append": false, 00:17:42.474 "compare": false, 00:17:42.474 "compare_and_write": false, 00:17:42.474 "abort": true, 00:17:42.474 "seek_hole": false, 00:17:42.474 "seek_data": false, 00:17:42.474 "copy": true, 00:17:42.474 "nvme_iov_md": false 00:17:42.474 }, 00:17:42.474 "memory_domains": [ 00:17:42.474 { 00:17:42.474 "dma_device_id": "system", 00:17:42.474 "dma_device_type": 1 00:17:42.474 }, 00:17:42.474 { 
00:17:42.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.474 "dma_device_type": 2 00:17:42.474 } 00:17:42.474 ], 00:17:42.474 "driver_specific": {} 00:17:42.474 } 00:17:42.474 ] 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.474 11:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:17:42.732 11:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.732 "name": "Existed_Raid", 00:17:42.732 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.732 "strip_size_kb": 0, 00:17:42.732 "state": "configuring", 00:17:42.732 "raid_level": "raid1", 00:17:42.733 "superblock": false, 00:17:42.733 "num_base_bdevs": 3, 00:17:42.733 "num_base_bdevs_discovered": 2, 00:17:42.733 "num_base_bdevs_operational": 3, 00:17:42.733 "base_bdevs_list": [ 00:17:42.733 { 00:17:42.733 "name": "BaseBdev1", 00:17:42.733 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:42.733 "is_configured": true, 00:17:42.733 "data_offset": 0, 00:17:42.733 "data_size": 65536 00:17:42.733 }, 00:17:42.733 { 00:17:42.733 "name": "BaseBdev2", 00:17:42.733 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:42.733 "is_configured": true, 00:17:42.733 "data_offset": 0, 00:17:42.733 "data_size": 65536 00:17:42.733 }, 00:17:42.733 { 00:17:42.733 "name": "BaseBdev3", 00:17:42.733 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.733 "is_configured": false, 00:17:42.733 "data_offset": 0, 00:17:42.733 "data_size": 0 00:17:42.733 } 00:17:42.733 ] 00:17:42.733 }' 00:17:42.733 11:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.733 11:58:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.300 11:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:43.559 [2024-07-15 11:58:57.096528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:43.559 [2024-07-15 11:58:57.096562] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe09480 00:17:43.559 [2024-07-15 11:58:57.096570] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
00:17:43.559 [2024-07-15 11:58:57.096788] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfbb570 00:17:43.559 [2024-07-15 11:58:57.096908] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe09480 00:17:43.559 [2024-07-15 11:58:57.096918] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe09480 00:17:43.559 [2024-07-15 11:58:57.097080] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:43.559 BaseBdev3 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:43.559 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:43.818 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:44.076 [ 00:17:44.076 { 00:17:44.076 "name": "BaseBdev3", 00:17:44.076 "aliases": [ 00:17:44.076 "44d5259b-9cd0-4237-9903-2e4d53645d86" 00:17:44.076 ], 00:17:44.076 "product_name": "Malloc disk", 00:17:44.076 "block_size": 512, 00:17:44.076 "num_blocks": 65536, 00:17:44.076 "uuid": "44d5259b-9cd0-4237-9903-2e4d53645d86", 00:17:44.076 "assigned_rate_limits": { 
00:17:44.076 "rw_ios_per_sec": 0, 00:17:44.076 "rw_mbytes_per_sec": 0, 00:17:44.076 "r_mbytes_per_sec": 0, 00:17:44.077 "w_mbytes_per_sec": 0 00:17:44.077 }, 00:17:44.077 "claimed": true, 00:17:44.077 "claim_type": "exclusive_write", 00:17:44.077 "zoned": false, 00:17:44.077 "supported_io_types": { 00:17:44.077 "read": true, 00:17:44.077 "write": true, 00:17:44.077 "unmap": true, 00:17:44.077 "flush": true, 00:17:44.077 "reset": true, 00:17:44.077 "nvme_admin": false, 00:17:44.077 "nvme_io": false, 00:17:44.077 "nvme_io_md": false, 00:17:44.077 "write_zeroes": true, 00:17:44.077 "zcopy": true, 00:17:44.077 "get_zone_info": false, 00:17:44.077 "zone_management": false, 00:17:44.077 "zone_append": false, 00:17:44.077 "compare": false, 00:17:44.077 "compare_and_write": false, 00:17:44.077 "abort": true, 00:17:44.077 "seek_hole": false, 00:17:44.077 "seek_data": false, 00:17:44.077 "copy": true, 00:17:44.077 "nvme_iov_md": false 00:17:44.077 }, 00:17:44.077 "memory_domains": [ 00:17:44.077 { 00:17:44.077 "dma_device_id": "system", 00:17:44.077 "dma_device_type": 1 00:17:44.077 }, 00:17:44.077 { 00:17:44.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:44.077 "dma_device_type": 2 00:17:44.077 } 00:17:44.077 ], 00:17:44.077 "driver_specific": {} 00:17:44.077 } 00:17:44.077 ] 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:44.077 
11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.077 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.336 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.336 "name": "Existed_Raid", 00:17:44.336 "uuid": "c527682f-b0ca-4d56-a6a7-42087f51778d", 00:17:44.336 "strip_size_kb": 0, 00:17:44.336 "state": "online", 00:17:44.336 "raid_level": "raid1", 00:17:44.336 "superblock": false, 00:17:44.336 "num_base_bdevs": 3, 00:17:44.336 "num_base_bdevs_discovered": 3, 00:17:44.336 "num_base_bdevs_operational": 3, 00:17:44.336 "base_bdevs_list": [ 00:17:44.336 { 00:17:44.336 "name": "BaseBdev1", 00:17:44.336 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:44.336 "is_configured": true, 00:17:44.336 "data_offset": 0, 00:17:44.336 "data_size": 65536 00:17:44.336 }, 00:17:44.336 { 00:17:44.336 "name": "BaseBdev2", 00:17:44.336 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:44.336 "is_configured": true, 00:17:44.336 "data_offset": 0, 
00:17:44.336 "data_size": 65536 00:17:44.336 }, 00:17:44.336 { 00:17:44.336 "name": "BaseBdev3", 00:17:44.336 "uuid": "44d5259b-9cd0-4237-9903-2e4d53645d86", 00:17:44.336 "is_configured": true, 00:17:44.336 "data_offset": 0, 00:17:44.336 "data_size": 65536 00:17:44.336 } 00:17:44.336 ] 00:17:44.336 }' 00:17:44.336 11:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.336 11:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:44.903 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:45.162 [2024-07-15 11:58:58.628890] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:45.162 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:45.162 "name": "Existed_Raid", 00:17:45.162 "aliases": [ 00:17:45.163 "c527682f-b0ca-4d56-a6a7-42087f51778d" 00:17:45.163 ], 00:17:45.163 "product_name": "Raid Volume", 00:17:45.163 "block_size": 512, 00:17:45.163 "num_blocks": 65536, 00:17:45.163 "uuid": 
"c527682f-b0ca-4d56-a6a7-42087f51778d", 00:17:45.163 "assigned_rate_limits": { 00:17:45.163 "rw_ios_per_sec": 0, 00:17:45.163 "rw_mbytes_per_sec": 0, 00:17:45.163 "r_mbytes_per_sec": 0, 00:17:45.163 "w_mbytes_per_sec": 0 00:17:45.163 }, 00:17:45.163 "claimed": false, 00:17:45.163 "zoned": false, 00:17:45.163 "supported_io_types": { 00:17:45.163 "read": true, 00:17:45.163 "write": true, 00:17:45.163 "unmap": false, 00:17:45.163 "flush": false, 00:17:45.163 "reset": true, 00:17:45.163 "nvme_admin": false, 00:17:45.163 "nvme_io": false, 00:17:45.163 "nvme_io_md": false, 00:17:45.163 "write_zeroes": true, 00:17:45.163 "zcopy": false, 00:17:45.163 "get_zone_info": false, 00:17:45.163 "zone_management": false, 00:17:45.163 "zone_append": false, 00:17:45.163 "compare": false, 00:17:45.163 "compare_and_write": false, 00:17:45.163 "abort": false, 00:17:45.163 "seek_hole": false, 00:17:45.163 "seek_data": false, 00:17:45.163 "copy": false, 00:17:45.163 "nvme_iov_md": false 00:17:45.163 }, 00:17:45.163 "memory_domains": [ 00:17:45.163 { 00:17:45.163 "dma_device_id": "system", 00:17:45.163 "dma_device_type": 1 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.163 "dma_device_type": 2 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "dma_device_id": "system", 00:17:45.163 "dma_device_type": 1 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.163 "dma_device_type": 2 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "dma_device_id": "system", 00:17:45.163 "dma_device_type": 1 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.163 "dma_device_type": 2 00:17:45.163 } 00:17:45.163 ], 00:17:45.163 "driver_specific": { 00:17:45.163 "raid": { 00:17:45.163 "uuid": "c527682f-b0ca-4d56-a6a7-42087f51778d", 00:17:45.163 "strip_size_kb": 0, 00:17:45.163 "state": "online", 00:17:45.163 "raid_level": "raid1", 00:17:45.163 "superblock": false, 00:17:45.163 
"num_base_bdevs": 3, 00:17:45.163 "num_base_bdevs_discovered": 3, 00:17:45.163 "num_base_bdevs_operational": 3, 00:17:45.163 "base_bdevs_list": [ 00:17:45.163 { 00:17:45.163 "name": "BaseBdev1", 00:17:45.163 "uuid": "de545d80-626f-4d56-acc1-abeb7310529a", 00:17:45.163 "is_configured": true, 00:17:45.163 "data_offset": 0, 00:17:45.163 "data_size": 65536 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "name": "BaseBdev2", 00:17:45.163 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:45.163 "is_configured": true, 00:17:45.163 "data_offset": 0, 00:17:45.163 "data_size": 65536 00:17:45.163 }, 00:17:45.163 { 00:17:45.163 "name": "BaseBdev3", 00:17:45.163 "uuid": "44d5259b-9cd0-4237-9903-2e4d53645d86", 00:17:45.163 "is_configured": true, 00:17:45.163 "data_offset": 0, 00:17:45.163 "data_size": 65536 00:17:45.163 } 00:17:45.163 ] 00:17:45.163 } 00:17:45.163 } 00:17:45.163 }' 00:17:45.163 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:45.163 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:45.163 BaseBdev2 00:17:45.163 BaseBdev3' 00:17:45.163 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.163 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:45.163 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.423 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.423 "name": "BaseBdev1", 00:17:45.423 "aliases": [ 00:17:45.423 "de545d80-626f-4d56-acc1-abeb7310529a" 00:17:45.423 ], 00:17:45.423 "product_name": "Malloc disk", 00:17:45.423 "block_size": 512, 00:17:45.423 "num_blocks": 65536, 00:17:45.423 "uuid": 
"de545d80-626f-4d56-acc1-abeb7310529a", 00:17:45.423 "assigned_rate_limits": { 00:17:45.423 "rw_ios_per_sec": 0, 00:17:45.423 "rw_mbytes_per_sec": 0, 00:17:45.423 "r_mbytes_per_sec": 0, 00:17:45.423 "w_mbytes_per_sec": 0 00:17:45.423 }, 00:17:45.423 "claimed": true, 00:17:45.423 "claim_type": "exclusive_write", 00:17:45.423 "zoned": false, 00:17:45.423 "supported_io_types": { 00:17:45.423 "read": true, 00:17:45.423 "write": true, 00:17:45.423 "unmap": true, 00:17:45.423 "flush": true, 00:17:45.423 "reset": true, 00:17:45.423 "nvme_admin": false, 00:17:45.423 "nvme_io": false, 00:17:45.423 "nvme_io_md": false, 00:17:45.423 "write_zeroes": true, 00:17:45.423 "zcopy": true, 00:17:45.423 "get_zone_info": false, 00:17:45.423 "zone_management": false, 00:17:45.423 "zone_append": false, 00:17:45.423 "compare": false, 00:17:45.423 "compare_and_write": false, 00:17:45.423 "abort": true, 00:17:45.423 "seek_hole": false, 00:17:45.423 "seek_data": false, 00:17:45.423 "copy": true, 00:17:45.423 "nvme_iov_md": false 00:17:45.423 }, 00:17:45.423 "memory_domains": [ 00:17:45.423 { 00:17:45.423 "dma_device_id": "system", 00:17:45.423 "dma_device_type": 1 00:17:45.423 }, 00:17:45.423 { 00:17:45.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.423 "dma_device_type": 2 00:17:45.423 } 00:17:45.423 ], 00:17:45.423 "driver_specific": {} 00:17:45.423 }' 00:17:45.423 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.423 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.423 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:45.423 11:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:45.683 11:58:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:45.683 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:45.942 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:45.942 "name": "BaseBdev2", 00:17:45.942 "aliases": [ 00:17:45.942 "050ee212-4a79-4079-9afa-b754af668170" 00:17:45.942 ], 00:17:45.942 "product_name": "Malloc disk", 00:17:45.942 "block_size": 512, 00:17:45.942 "num_blocks": 65536, 00:17:45.942 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:45.942 "assigned_rate_limits": { 00:17:45.942 "rw_ios_per_sec": 0, 00:17:45.942 "rw_mbytes_per_sec": 0, 00:17:45.942 "r_mbytes_per_sec": 0, 00:17:45.942 "w_mbytes_per_sec": 0 00:17:45.942 }, 00:17:45.942 "claimed": true, 00:17:45.942 "claim_type": "exclusive_write", 00:17:45.942 "zoned": false, 00:17:45.942 "supported_io_types": { 00:17:45.942 "read": true, 00:17:45.942 "write": true, 00:17:45.942 "unmap": true, 00:17:45.942 "flush": true, 00:17:45.942 "reset": true, 00:17:45.942 "nvme_admin": false, 00:17:45.942 "nvme_io": false, 00:17:45.942 "nvme_io_md": false, 
00:17:45.942 "write_zeroes": true, 00:17:45.942 "zcopy": true, 00:17:45.942 "get_zone_info": false, 00:17:45.942 "zone_management": false, 00:17:45.942 "zone_append": false, 00:17:45.942 "compare": false, 00:17:45.942 "compare_and_write": false, 00:17:45.942 "abort": true, 00:17:45.942 "seek_hole": false, 00:17:45.942 "seek_data": false, 00:17:45.942 "copy": true, 00:17:45.942 "nvme_iov_md": false 00:17:45.942 }, 00:17:45.942 "memory_domains": [ 00:17:45.942 { 00:17:45.942 "dma_device_id": "system", 00:17:45.942 "dma_device_type": 1 00:17:45.942 }, 00:17:45.942 { 00:17:45.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:45.942 "dma_device_type": 2 00:17:45.942 } 00:17:45.942 ], 00:17:45.942 "driver_specific": {} 00:17:45.942 }' 00:17:45.942 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:45.942 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.202 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.461 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.461 11:58:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.461 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:46.461 11:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:46.720 "name": "BaseBdev3", 00:17:46.720 "aliases": [ 00:17:46.720 "44d5259b-9cd0-4237-9903-2e4d53645d86" 00:17:46.720 ], 00:17:46.720 "product_name": "Malloc disk", 00:17:46.720 "block_size": 512, 00:17:46.720 "num_blocks": 65536, 00:17:46.720 "uuid": "44d5259b-9cd0-4237-9903-2e4d53645d86", 00:17:46.720 "assigned_rate_limits": { 00:17:46.720 "rw_ios_per_sec": 0, 00:17:46.720 "rw_mbytes_per_sec": 0, 00:17:46.720 "r_mbytes_per_sec": 0, 00:17:46.720 "w_mbytes_per_sec": 0 00:17:46.720 }, 00:17:46.720 "claimed": true, 00:17:46.720 "claim_type": "exclusive_write", 00:17:46.720 "zoned": false, 00:17:46.720 "supported_io_types": { 00:17:46.720 "read": true, 00:17:46.720 "write": true, 00:17:46.720 "unmap": true, 00:17:46.720 "flush": true, 00:17:46.720 "reset": true, 00:17:46.720 "nvme_admin": false, 00:17:46.720 "nvme_io": false, 00:17:46.720 "nvme_io_md": false, 00:17:46.720 "write_zeroes": true, 00:17:46.720 "zcopy": true, 00:17:46.720 "get_zone_info": false, 00:17:46.720 "zone_management": false, 00:17:46.720 "zone_append": false, 00:17:46.720 "compare": false, 00:17:46.720 "compare_and_write": false, 00:17:46.720 "abort": true, 00:17:46.720 "seek_hole": false, 00:17:46.720 "seek_data": false, 00:17:46.720 "copy": true, 00:17:46.720 "nvme_iov_md": false 00:17:46.720 }, 00:17:46.720 "memory_domains": [ 00:17:46.720 { 00:17:46.720 "dma_device_id": "system", 00:17:46.720 "dma_device_type": 1 00:17:46.720 }, 00:17:46.720 { 00:17:46.720 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:46.720 "dma_device_type": 2 00:17:46.720 } 00:17:46.720 ], 00:17:46.720 "driver_specific": {} 00:17:46.720 }' 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:46.720 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:46.980 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:47.239 [2024-07-15 11:59:00.682095] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.239 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:47.499 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:47.499 "name": "Existed_Raid", 00:17:47.499 "uuid": "c527682f-b0ca-4d56-a6a7-42087f51778d", 00:17:47.499 "strip_size_kb": 0, 00:17:47.499 "state": "online", 00:17:47.499 "raid_level": "raid1", 
00:17:47.499 "superblock": false, 00:17:47.499 "num_base_bdevs": 3, 00:17:47.499 "num_base_bdevs_discovered": 2, 00:17:47.499 "num_base_bdevs_operational": 2, 00:17:47.499 "base_bdevs_list": [ 00:17:47.499 { 00:17:47.499 "name": null, 00:17:47.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:47.499 "is_configured": false, 00:17:47.499 "data_offset": 0, 00:17:47.499 "data_size": 65536 00:17:47.499 }, 00:17:47.499 { 00:17:47.499 "name": "BaseBdev2", 00:17:47.499 "uuid": "050ee212-4a79-4079-9afa-b754af668170", 00:17:47.499 "is_configured": true, 00:17:47.499 "data_offset": 0, 00:17:47.499 "data_size": 65536 00:17:47.499 }, 00:17:47.499 { 00:17:47.499 "name": "BaseBdev3", 00:17:47.499 "uuid": "44d5259b-9cd0-4237-9903-2e4d53645d86", 00:17:47.499 "is_configured": true, 00:17:47.499 "data_offset": 0, 00:17:47.499 "data_size": 65536 00:17:47.499 } 00:17:47.499 ] 00:17:47.499 }' 00:17:47.499 11:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:47.499 11:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.065 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:48.065 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:48.065 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.065 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:48.322 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:48.322 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:48.322 11:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:48.581 [2024-07-15 11:59:02.062943] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:48.581 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:48.582 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:48.582 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:48.582 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:48.841 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:48.841 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:48.841 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:49.100 [2024-07-15 11:59:02.594948] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:49.100 [2024-07-15 11:59:02.595035] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:49.100 [2024-07-15 11:59:02.607637] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:49.100 [2024-07-15 11:59:02.607673] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:49.100 [2024-07-15 11:59:02.607692] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe09480 name Existed_Raid, state offline 00:17:49.100 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:49.100 11:59:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:49.100 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.100 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:49.402 11:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:49.661 BaseBdev2 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:49.662 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:49.920 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:50.179 [ 00:17:50.179 { 00:17:50.179 "name": "BaseBdev2", 00:17:50.179 "aliases": [ 00:17:50.179 "95d09d3a-4aa7-4c2f-951a-ef41c80da560" 00:17:50.179 ], 00:17:50.179 "product_name": "Malloc disk", 00:17:50.179 "block_size": 512, 00:17:50.179 "num_blocks": 65536, 00:17:50.179 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:50.179 "assigned_rate_limits": { 00:17:50.179 "rw_ios_per_sec": 0, 00:17:50.179 "rw_mbytes_per_sec": 0, 00:17:50.179 "r_mbytes_per_sec": 0, 00:17:50.179 "w_mbytes_per_sec": 0 00:17:50.179 }, 00:17:50.179 "claimed": false, 00:17:50.179 "zoned": false, 00:17:50.179 "supported_io_types": { 00:17:50.179 "read": true, 00:17:50.179 "write": true, 00:17:50.179 "unmap": true, 00:17:50.179 "flush": true, 00:17:50.179 "reset": true, 00:17:50.179 "nvme_admin": false, 00:17:50.179 "nvme_io": false, 00:17:50.179 "nvme_io_md": false, 00:17:50.179 "write_zeroes": true, 00:17:50.179 "zcopy": true, 00:17:50.179 "get_zone_info": false, 00:17:50.179 "zone_management": false, 00:17:50.179 "zone_append": false, 00:17:50.179 "compare": false, 00:17:50.179 "compare_and_write": false, 00:17:50.179 "abort": true, 00:17:50.179 "seek_hole": false, 00:17:50.179 "seek_data": false, 00:17:50.179 "copy": true, 00:17:50.179 "nvme_iov_md": false 00:17:50.179 }, 00:17:50.179 "memory_domains": [ 00:17:50.179 { 00:17:50.179 "dma_device_id": "system", 00:17:50.179 "dma_device_type": 1 00:17:50.179 }, 00:17:50.179 { 00:17:50.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.179 "dma_device_type": 2 00:17:50.179 } 00:17:50.179 ], 00:17:50.179 "driver_specific": {} 00:17:50.179 } 00:17:50.179 ] 00:17:50.179 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:50.179 
11:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:50.179 11:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:50.179 11:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:50.436 BaseBdev3 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:50.436 11:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:50.694 11:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:50.953 [ 00:17:50.953 { 00:17:50.953 "name": "BaseBdev3", 00:17:50.953 "aliases": [ 00:17:50.953 "dcf5c137-3ace-456c-8417-779ef028cb16" 00:17:50.953 ], 00:17:50.953 "product_name": "Malloc disk", 00:17:50.953 "block_size": 512, 00:17:50.953 "num_blocks": 65536, 00:17:50.953 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:50.953 "assigned_rate_limits": { 00:17:50.953 "rw_ios_per_sec": 0, 00:17:50.953 "rw_mbytes_per_sec": 0, 00:17:50.953 
"r_mbytes_per_sec": 0, 00:17:50.953 "w_mbytes_per_sec": 0 00:17:50.953 }, 00:17:50.953 "claimed": false, 00:17:50.953 "zoned": false, 00:17:50.953 "supported_io_types": { 00:17:50.953 "read": true, 00:17:50.953 "write": true, 00:17:50.953 "unmap": true, 00:17:50.953 "flush": true, 00:17:50.953 "reset": true, 00:17:50.953 "nvme_admin": false, 00:17:50.953 "nvme_io": false, 00:17:50.953 "nvme_io_md": false, 00:17:50.953 "write_zeroes": true, 00:17:50.953 "zcopy": true, 00:17:50.953 "get_zone_info": false, 00:17:50.953 "zone_management": false, 00:17:50.953 "zone_append": false, 00:17:50.953 "compare": false, 00:17:50.953 "compare_and_write": false, 00:17:50.953 "abort": true, 00:17:50.953 "seek_hole": false, 00:17:50.953 "seek_data": false, 00:17:50.953 "copy": true, 00:17:50.953 "nvme_iov_md": false 00:17:50.953 }, 00:17:50.953 "memory_domains": [ 00:17:50.953 { 00:17:50.953 "dma_device_id": "system", 00:17:50.953 "dma_device_type": 1 00:17:50.953 }, 00:17:50.953 { 00:17:50.953 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:50.953 "dma_device_type": 2 00:17:50.953 } 00:17:50.953 ], 00:17:50.953 "driver_specific": {} 00:17:50.953 } 00:17:50.953 ] 00:17:50.953 11:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:50.953 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:50.953 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:50.953 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:17:51.213 [2024-07-15 11:59:04.565134] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:51.213 [2024-07-15 11:59:04.565176] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:17:51.213 [2024-07-15 11:59:04.565195] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:51.213 [2024-07-15 11:59:04.566734] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:51.213 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:51.475 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:51.475 "name": "Existed_Raid", 00:17:51.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.475 "strip_size_kb": 0, 00:17:51.475 "state": 
"configuring", 00:17:51.475 "raid_level": "raid1", 00:17:51.475 "superblock": false, 00:17:51.475 "num_base_bdevs": 3, 00:17:51.475 "num_base_bdevs_discovered": 2, 00:17:51.475 "num_base_bdevs_operational": 3, 00:17:51.475 "base_bdevs_list": [ 00:17:51.475 { 00:17:51.475 "name": "BaseBdev1", 00:17:51.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:51.475 "is_configured": false, 00:17:51.475 "data_offset": 0, 00:17:51.475 "data_size": 0 00:17:51.475 }, 00:17:51.475 { 00:17:51.475 "name": "BaseBdev2", 00:17:51.475 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:51.475 "is_configured": true, 00:17:51.475 "data_offset": 0, 00:17:51.475 "data_size": 65536 00:17:51.475 }, 00:17:51.475 { 00:17:51.475 "name": "BaseBdev3", 00:17:51.475 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:51.475 "is_configured": true, 00:17:51.475 "data_offset": 0, 00:17:51.475 "data_size": 65536 00:17:51.475 } 00:17:51.475 ] 00:17:51.475 }' 00:17:51.475 11:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:51.475 11:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.045 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:52.305 [2024-07-15 11:59:05.668045] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:52.305 11:59:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.305 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.579 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.579 "name": "Existed_Raid", 00:17:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.579 "strip_size_kb": 0, 00:17:52.579 "state": "configuring", 00:17:52.579 "raid_level": "raid1", 00:17:52.579 "superblock": false, 00:17:52.579 "num_base_bdevs": 3, 00:17:52.579 "num_base_bdevs_discovered": 1, 00:17:52.579 "num_base_bdevs_operational": 3, 00:17:52.579 "base_bdevs_list": [ 00:17:52.579 { 00:17:52.579 "name": "BaseBdev1", 00:17:52.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.579 "is_configured": false, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 0 00:17:52.579 }, 00:17:52.579 { 00:17:52.579 "name": null, 00:17:52.579 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:52.579 "is_configured": false, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 65536 00:17:52.579 }, 00:17:52.579 { 00:17:52.579 "name": "BaseBdev3", 00:17:52.579 "uuid": 
"dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:52.579 "is_configured": true, 00:17:52.579 "data_offset": 0, 00:17:52.579 "data_size": 65536 00:17:52.579 } 00:17:52.579 ] 00:17:52.579 }' 00:17:52.579 11:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.579 11:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:52.960 11:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.960 11:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:53.219 11:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:53.219 11:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:53.479 [2024-07-15 11:59:06.947968] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:53.479 BaseBdev1 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:17:53.479 11:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:53.738 11:59:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:53.997 [ 00:17:53.997 { 00:17:53.997 "name": "BaseBdev1", 00:17:53.997 "aliases": [ 00:17:53.997 "6f60690b-7abb-4743-af9c-b042fd464811" 00:17:53.997 ], 00:17:53.997 "product_name": "Malloc disk", 00:17:53.997 "block_size": 512, 00:17:53.997 "num_blocks": 65536, 00:17:53.997 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:53.997 "assigned_rate_limits": { 00:17:53.997 "rw_ios_per_sec": 0, 00:17:53.997 "rw_mbytes_per_sec": 0, 00:17:53.997 "r_mbytes_per_sec": 0, 00:17:53.997 "w_mbytes_per_sec": 0 00:17:53.997 }, 00:17:53.997 "claimed": true, 00:17:53.997 "claim_type": "exclusive_write", 00:17:53.997 "zoned": false, 00:17:53.997 "supported_io_types": { 00:17:53.997 "read": true, 00:17:53.997 "write": true, 00:17:53.997 "unmap": true, 00:17:53.997 "flush": true, 00:17:53.997 "reset": true, 00:17:53.997 "nvme_admin": false, 00:17:53.997 "nvme_io": false, 00:17:53.997 "nvme_io_md": false, 00:17:53.997 "write_zeroes": true, 00:17:53.997 "zcopy": true, 00:17:53.997 "get_zone_info": false, 00:17:53.997 "zone_management": false, 00:17:53.997 "zone_append": false, 00:17:53.997 "compare": false, 00:17:53.997 "compare_and_write": false, 00:17:53.997 "abort": true, 00:17:53.997 "seek_hole": false, 00:17:53.997 "seek_data": false, 00:17:53.997 "copy": true, 00:17:53.997 "nvme_iov_md": false 00:17:53.997 }, 00:17:53.997 "memory_domains": [ 00:17:53.998 { 00:17:53.998 "dma_device_id": "system", 00:17:53.998 "dma_device_type": 1 00:17:53.998 }, 00:17:53.998 { 00:17:53.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:53.998 "dma_device_type": 2 00:17:53.998 } 00:17:53.998 ], 00:17:53.998 "driver_specific": {} 00:17:53.998 } 00:17:53.998 ] 
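The repeated `verify_raid_bdev_state Existed_Raid configuring raid1 0 3` calls in this trace fetch the raid bdev via `bdev_raid_get_bdevs all`, select it by name with jq, and compare its state fields against the expected values. The sketch below is an illustrative Python re-implementation of what that shell helper (`bdev_raid.sh@116-128`) appears to assert, not SPDK code; the sample payload is hand-built to mirror the shape of the RPC output captured in this log.

```python
import json

def verify_raid_bdev_state(raid_info_json, name, expected_state,
                           raid_level, strip_size_kb, num_operational):
    """Sketch of the shell helper's checks: select the named raid bdev
    from the bdev_raid_get_bdevs output and compare the fields the
    test asserts on (state, level, strip size, base-bdev counts)."""
    bdevs = json.loads(raid_info_json)
    info = next(b for b in bdevs if b["name"] == name)
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size_kb
    assert info["num_base_bdevs_operational"] == num_operational
    # "discovered" counts only the base bdevs that are currently configured,
    # which is why removing BaseBdev2 drops the count from 2 to 1 above.
    discovered = sum(1 for b in info["base_bdevs_list"] if b["is_configured"])
    assert info["num_base_bdevs_discovered"] == discovered
    return info

# Hand-built sample shaped like the RPC output in this trace.
sample = json.dumps([{
    "name": "Existed_Raid", "state": "configuring", "raid_level": "raid1",
    "strip_size_kb": 0, "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 2, "num_base_bdevs_operational": 3,
    "base_bdevs_list": [
        {"name": "BaseBdev1", "is_configured": True},
        {"name": None, "is_configured": False},
        {"name": "BaseBdev3", "is_configured": True},
    ],
}])
info = verify_raid_bdev_state(sample, "Existed_Raid", "configuring", "raid1", 0, 3)
```

A raid1 array stays in the `configuring` state here because `num_base_bdevs_discovered` (2) is still below `num_base_bdevs_operational` (3); only once all three base bdevs are configured does the trace later show the state flip to `online`.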
00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.998 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:54.257 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:54.257 "name": "Existed_Raid", 00:17:54.257 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:54.257 "strip_size_kb": 0, 00:17:54.257 "state": "configuring", 00:17:54.257 "raid_level": "raid1", 00:17:54.257 "superblock": false, 00:17:54.257 "num_base_bdevs": 3, 00:17:54.257 
"num_base_bdevs_discovered": 2, 00:17:54.257 "num_base_bdevs_operational": 3, 00:17:54.257 "base_bdevs_list": [ 00:17:54.257 { 00:17:54.257 "name": "BaseBdev1", 00:17:54.257 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:54.257 "is_configured": true, 00:17:54.257 "data_offset": 0, 00:17:54.257 "data_size": 65536 00:17:54.257 }, 00:17:54.257 { 00:17:54.257 "name": null, 00:17:54.257 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:54.257 "is_configured": false, 00:17:54.257 "data_offset": 0, 00:17:54.257 "data_size": 65536 00:17:54.257 }, 00:17:54.257 { 00:17:54.257 "name": "BaseBdev3", 00:17:54.257 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:54.257 "is_configured": true, 00:17:54.257 "data_offset": 0, 00:17:54.257 "data_size": 65536 00:17:54.257 } 00:17:54.257 ] 00:17:54.257 }' 00:17:54.257 11:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:54.257 11:59:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:54.826 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.826 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:55.085 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:55.085 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:55.344 [2024-07-15 11:59:08.688614] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.344 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.604 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.604 "name": "Existed_Raid", 00:17:55.604 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.604 "strip_size_kb": 0, 00:17:55.604 "state": "configuring", 00:17:55.604 "raid_level": "raid1", 00:17:55.604 "superblock": false, 00:17:55.604 "num_base_bdevs": 3, 00:17:55.604 "num_base_bdevs_discovered": 1, 00:17:55.604 "num_base_bdevs_operational": 3, 00:17:55.604 "base_bdevs_list": [ 00:17:55.604 { 00:17:55.604 "name": "BaseBdev1", 00:17:55.604 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:55.604 "is_configured": true, 00:17:55.604 "data_offset": 0, 00:17:55.604 "data_size": 65536 
00:17:55.604 }, 00:17:55.604 { 00:17:55.604 "name": null, 00:17:55.604 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:55.604 "is_configured": false, 00:17:55.604 "data_offset": 0, 00:17:55.604 "data_size": 65536 00:17:55.604 }, 00:17:55.604 { 00:17:55.604 "name": null, 00:17:55.604 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:55.604 "is_configured": false, 00:17:55.604 "data_offset": 0, 00:17:55.604 "data_size": 65536 00:17:55.604 } 00:17:55.604 ] 00:17:55.604 }' 00:17:55.604 11:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.604 11:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.173 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:56.173 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.173 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:56.173 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:56.432 [2024-07-15 11:59:09.963996] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:56.432 11:59:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.432 11:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.692 11:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.692 "name": "Existed_Raid", 00:17:56.692 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.692 "strip_size_kb": 0, 00:17:56.692 "state": "configuring", 00:17:56.692 "raid_level": "raid1", 00:17:56.692 "superblock": false, 00:17:56.692 "num_base_bdevs": 3, 00:17:56.692 "num_base_bdevs_discovered": 2, 00:17:56.692 "num_base_bdevs_operational": 3, 00:17:56.692 "base_bdevs_list": [ 00:17:56.692 { 00:17:56.692 "name": "BaseBdev1", 00:17:56.692 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:56.692 "is_configured": true, 00:17:56.692 "data_offset": 0, 00:17:56.692 "data_size": 65536 00:17:56.692 }, 00:17:56.692 { 00:17:56.692 "name": null, 00:17:56.692 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:56.692 "is_configured": false, 00:17:56.692 "data_offset": 0, 00:17:56.692 "data_size": 65536 00:17:56.692 }, 00:17:56.692 { 00:17:56.692 "name": "BaseBdev3", 00:17:56.692 "uuid": 
"dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:56.692 "is_configured": true, 00:17:56.692 "data_offset": 0, 00:17:56.692 "data_size": 65536 00:17:56.692 } 00:17:56.692 ] 00:17:56.692 }' 00:17:56.692 11:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.692 11:59:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:57.260 11:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.260 11:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:57.520 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:57.520 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:57.779 [2024-07-15 11:59:11.255564] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.779 11:59:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.779 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:58.039 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:58.039 "name": "Existed_Raid", 00:17:58.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:58.039 "strip_size_kb": 0, 00:17:58.039 "state": "configuring", 00:17:58.039 "raid_level": "raid1", 00:17:58.039 "superblock": false, 00:17:58.039 "num_base_bdevs": 3, 00:17:58.039 "num_base_bdevs_discovered": 1, 00:17:58.039 "num_base_bdevs_operational": 3, 00:17:58.039 "base_bdevs_list": [ 00:17:58.039 { 00:17:58.039 "name": null, 00:17:58.039 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:58.039 "is_configured": false, 00:17:58.039 "data_offset": 0, 00:17:58.039 "data_size": 65536 00:17:58.039 }, 00:17:58.039 { 00:17:58.039 "name": null, 00:17:58.039 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:58.039 "is_configured": false, 00:17:58.039 "data_offset": 0, 00:17:58.039 "data_size": 65536 00:17:58.039 }, 00:17:58.039 { 00:17:58.039 "name": "BaseBdev3", 00:17:58.039 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:58.039 "is_configured": true, 00:17:58.039 "data_offset": 0, 00:17:58.039 "data_size": 65536 00:17:58.039 } 00:17:58.039 ] 00:17:58.039 }' 00:17:58.039 11:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:58.039 11:59:11 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:17:58.606 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:58.606 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:58.864 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:58.864 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:59.123 [2024-07-15 11:59:12.633764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.123 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:59.381 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:59.381 "name": "Existed_Raid", 00:17:59.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:59.381 "strip_size_kb": 0, 00:17:59.381 "state": "configuring", 00:17:59.381 "raid_level": "raid1", 00:17:59.381 "superblock": false, 00:17:59.381 "num_base_bdevs": 3, 00:17:59.381 "num_base_bdevs_discovered": 2, 00:17:59.381 "num_base_bdevs_operational": 3, 00:17:59.381 "base_bdevs_list": [ 00:17:59.382 { 00:17:59.382 "name": null, 00:17:59.382 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:17:59.382 "is_configured": false, 00:17:59.382 "data_offset": 0, 00:17:59.382 "data_size": 65536 00:17:59.382 }, 00:17:59.382 { 00:17:59.382 "name": "BaseBdev2", 00:17:59.382 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:17:59.382 "is_configured": true, 00:17:59.382 "data_offset": 0, 00:17:59.382 "data_size": 65536 00:17:59.382 }, 00:17:59.382 { 00:17:59.382 "name": "BaseBdev3", 00:17:59.382 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:17:59.382 "is_configured": true, 00:17:59.382 "data_offset": 0, 00:17:59.382 "data_size": 65536 00:17:59.382 } 00:17:59.382 ] 00:17:59.382 }' 00:17:59.382 11:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:59.382 11:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:59.949 11:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.949 11:59:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:00.207 11:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:00.208 11:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.208 11:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:00.467 11:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 6f60690b-7abb-4743-af9c-b042fd464811 00:18:00.726 [2024-07-15 11:59:14.182411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:00.726 [2024-07-15 11:59:14.182449] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe0c5e0 00:18:00.726 [2024-07-15 11:59:14.182458] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:18:00.726 [2024-07-15 11:59:14.182651] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe09ee0 00:18:00.726 [2024-07-15 11:59:14.182783] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe0c5e0 00:18:00.726 [2024-07-15 11:59:14.182794] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xe0c5e0 00:18:00.726 [2024-07-15 11:59:14.182959] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:00.726 NewBaseBdev 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # 
local bdev_timeout= 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:00.726 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:00.986 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:01.245 [ 00:18:01.245 { 00:18:01.245 "name": "NewBaseBdev", 00:18:01.245 "aliases": [ 00:18:01.245 "6f60690b-7abb-4743-af9c-b042fd464811" 00:18:01.245 ], 00:18:01.245 "product_name": "Malloc disk", 00:18:01.245 "block_size": 512, 00:18:01.245 "num_blocks": 65536, 00:18:01.245 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:18:01.245 "assigned_rate_limits": { 00:18:01.245 "rw_ios_per_sec": 0, 00:18:01.245 "rw_mbytes_per_sec": 0, 00:18:01.245 "r_mbytes_per_sec": 0, 00:18:01.245 "w_mbytes_per_sec": 0 00:18:01.245 }, 00:18:01.245 "claimed": true, 00:18:01.245 "claim_type": "exclusive_write", 00:18:01.245 "zoned": false, 00:18:01.245 "supported_io_types": { 00:18:01.245 "read": true, 00:18:01.245 "write": true, 00:18:01.245 "unmap": true, 00:18:01.245 "flush": true, 00:18:01.245 "reset": true, 00:18:01.245 "nvme_admin": false, 00:18:01.245 "nvme_io": false, 00:18:01.245 "nvme_io_md": false, 00:18:01.245 "write_zeroes": true, 00:18:01.245 "zcopy": true, 00:18:01.245 "get_zone_info": false, 00:18:01.245 "zone_management": false, 00:18:01.245 "zone_append": false, 00:18:01.245 "compare": false, 00:18:01.245 "compare_and_write": false, 00:18:01.245 "abort": true, 00:18:01.245 "seek_hole": false, 
00:18:01.245 "seek_data": false, 00:18:01.245 "copy": true, 00:18:01.245 "nvme_iov_md": false 00:18:01.245 }, 00:18:01.245 "memory_domains": [ 00:18:01.245 { 00:18:01.245 "dma_device_id": "system", 00:18:01.245 "dma_device_type": 1 00:18:01.245 }, 00:18:01.245 { 00:18:01.245 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.245 "dma_device_type": 2 00:18:01.245 } 00:18:01.245 ], 00:18:01.245 "driver_specific": {} 00:18:01.245 } 00:18:01.245 ] 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.245 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:18:01.505 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.505 "name": "Existed_Raid", 00:18:01.505 "uuid": "d8d779e2-fddf-4d20-881d-0b451a221aa7", 00:18:01.505 "strip_size_kb": 0, 00:18:01.505 "state": "online", 00:18:01.505 "raid_level": "raid1", 00:18:01.505 "superblock": false, 00:18:01.505 "num_base_bdevs": 3, 00:18:01.505 "num_base_bdevs_discovered": 3, 00:18:01.505 "num_base_bdevs_operational": 3, 00:18:01.505 "base_bdevs_list": [ 00:18:01.505 { 00:18:01.505 "name": "NewBaseBdev", 00:18:01.505 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:18:01.505 "is_configured": true, 00:18:01.505 "data_offset": 0, 00:18:01.505 "data_size": 65536 00:18:01.505 }, 00:18:01.505 { 00:18:01.505 "name": "BaseBdev2", 00:18:01.505 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:18:01.505 "is_configured": true, 00:18:01.505 "data_offset": 0, 00:18:01.505 "data_size": 65536 00:18:01.505 }, 00:18:01.505 { 00:18:01.505 "name": "BaseBdev3", 00:18:01.505 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:18:01.505 "is_configured": true, 00:18:01.505 "data_offset": 0, 00:18:01.505 "data_size": 65536 00:18:01.505 } 00:18:01.505 ] 00:18:01.505 }' 00:18:01.505 11:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.505 11:59:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:02.072 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:02.332 [2024-07-15 11:59:15.747015] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:02.332 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:02.332 "name": "Existed_Raid", 00:18:02.332 "aliases": [ 00:18:02.332 "d8d779e2-fddf-4d20-881d-0b451a221aa7" 00:18:02.332 ], 00:18:02.332 "product_name": "Raid Volume", 00:18:02.332 "block_size": 512, 00:18:02.332 "num_blocks": 65536, 00:18:02.332 "uuid": "d8d779e2-fddf-4d20-881d-0b451a221aa7", 00:18:02.332 "assigned_rate_limits": { 00:18:02.332 "rw_ios_per_sec": 0, 00:18:02.332 "rw_mbytes_per_sec": 0, 00:18:02.332 "r_mbytes_per_sec": 0, 00:18:02.332 "w_mbytes_per_sec": 0 00:18:02.332 }, 00:18:02.332 "claimed": false, 00:18:02.332 "zoned": false, 00:18:02.332 "supported_io_types": { 00:18:02.332 "read": true, 00:18:02.332 "write": true, 00:18:02.332 "unmap": false, 00:18:02.332 "flush": false, 00:18:02.332 "reset": true, 00:18:02.332 "nvme_admin": false, 00:18:02.332 "nvme_io": false, 00:18:02.332 "nvme_io_md": false, 00:18:02.332 "write_zeroes": true, 00:18:02.332 "zcopy": false, 00:18:02.332 "get_zone_info": false, 00:18:02.332 "zone_management": false, 00:18:02.332 "zone_append": false, 00:18:02.332 "compare": false, 00:18:02.332 "compare_and_write": false, 00:18:02.332 "abort": false, 00:18:02.332 "seek_hole": false, 00:18:02.332 "seek_data": false, 00:18:02.332 "copy": false, 00:18:02.332 "nvme_iov_md": false 00:18:02.332 }, 00:18:02.332 "memory_domains": [ 00:18:02.332 { 00:18:02.332 "dma_device_id": "system", 
00:18:02.332 "dma_device_type": 1 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.332 "dma_device_type": 2 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "dma_device_id": "system", 00:18:02.332 "dma_device_type": 1 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.332 "dma_device_type": 2 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "dma_device_id": "system", 00:18:02.332 "dma_device_type": 1 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.332 "dma_device_type": 2 00:18:02.332 } 00:18:02.332 ], 00:18:02.332 "driver_specific": { 00:18:02.332 "raid": { 00:18:02.332 "uuid": "d8d779e2-fddf-4d20-881d-0b451a221aa7", 00:18:02.332 "strip_size_kb": 0, 00:18:02.332 "state": "online", 00:18:02.332 "raid_level": "raid1", 00:18:02.332 "superblock": false, 00:18:02.332 "num_base_bdevs": 3, 00:18:02.332 "num_base_bdevs_discovered": 3, 00:18:02.332 "num_base_bdevs_operational": 3, 00:18:02.332 "base_bdevs_list": [ 00:18:02.332 { 00:18:02.332 "name": "NewBaseBdev", 00:18:02.332 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:18:02.332 "is_configured": true, 00:18:02.332 "data_offset": 0, 00:18:02.332 "data_size": 65536 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "name": "BaseBdev2", 00:18:02.332 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:18:02.332 "is_configured": true, 00:18:02.332 "data_offset": 0, 00:18:02.332 "data_size": 65536 00:18:02.332 }, 00:18:02.332 { 00:18:02.332 "name": "BaseBdev3", 00:18:02.332 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:18:02.332 "is_configured": true, 00:18:02.332 "data_offset": 0, 00:18:02.332 "data_size": 65536 00:18:02.332 } 00:18:02.332 ] 00:18:02.332 } 00:18:02.332 } 00:18:02.332 }' 00:18:02.332 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:02.333 11:59:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:02.333 BaseBdev2 00:18:02.333 BaseBdev3' 00:18:02.333 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.333 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:02.333 11:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:02.592 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:02.593 "name": "NewBaseBdev", 00:18:02.593 "aliases": [ 00:18:02.593 "6f60690b-7abb-4743-af9c-b042fd464811" 00:18:02.593 ], 00:18:02.593 "product_name": "Malloc disk", 00:18:02.593 "block_size": 512, 00:18:02.593 "num_blocks": 65536, 00:18:02.593 "uuid": "6f60690b-7abb-4743-af9c-b042fd464811", 00:18:02.593 "assigned_rate_limits": { 00:18:02.593 "rw_ios_per_sec": 0, 00:18:02.593 "rw_mbytes_per_sec": 0, 00:18:02.593 "r_mbytes_per_sec": 0, 00:18:02.593 "w_mbytes_per_sec": 0 00:18:02.593 }, 00:18:02.593 "claimed": true, 00:18:02.593 "claim_type": "exclusive_write", 00:18:02.593 "zoned": false, 00:18:02.593 "supported_io_types": { 00:18:02.593 "read": true, 00:18:02.593 "write": true, 00:18:02.593 "unmap": true, 00:18:02.593 "flush": true, 00:18:02.593 "reset": true, 00:18:02.593 "nvme_admin": false, 00:18:02.593 "nvme_io": false, 00:18:02.593 "nvme_io_md": false, 00:18:02.593 "write_zeroes": true, 00:18:02.593 "zcopy": true, 00:18:02.593 "get_zone_info": false, 00:18:02.593 "zone_management": false, 00:18:02.593 "zone_append": false, 00:18:02.593 "compare": false, 00:18:02.593 "compare_and_write": false, 00:18:02.593 "abort": true, 00:18:02.593 "seek_hole": false, 00:18:02.593 "seek_data": false, 00:18:02.593 "copy": true, 00:18:02.593 "nvme_iov_md": false 00:18:02.593 }, 00:18:02.593 "memory_domains": [ 
00:18:02.593 { 00:18:02.593 "dma_device_id": "system", 00:18:02.593 "dma_device_type": 1 00:18:02.593 }, 00:18:02.593 { 00:18:02.593 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:02.593 "dma_device_type": 2 00:18:02.593 } 00:18:02.593 ], 00:18:02.593 "driver_specific": {} 00:18:02.593 }' 00:18:02.593 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.593 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:02.593 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:02.593 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:02.852 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.111 11:59:16 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.111 "name": "BaseBdev2", 00:18:03.111 "aliases": [ 00:18:03.111 "95d09d3a-4aa7-4c2f-951a-ef41c80da560" 00:18:03.111 ], 00:18:03.111 "product_name": "Malloc disk", 00:18:03.111 "block_size": 512, 00:18:03.111 "num_blocks": 65536, 00:18:03.111 "uuid": "95d09d3a-4aa7-4c2f-951a-ef41c80da560", 00:18:03.111 "assigned_rate_limits": { 00:18:03.111 "rw_ios_per_sec": 0, 00:18:03.111 "rw_mbytes_per_sec": 0, 00:18:03.111 "r_mbytes_per_sec": 0, 00:18:03.112 "w_mbytes_per_sec": 0 00:18:03.112 }, 00:18:03.112 "claimed": true, 00:18:03.112 "claim_type": "exclusive_write", 00:18:03.112 "zoned": false, 00:18:03.112 "supported_io_types": { 00:18:03.112 "read": true, 00:18:03.112 "write": true, 00:18:03.112 "unmap": true, 00:18:03.112 "flush": true, 00:18:03.112 "reset": true, 00:18:03.112 "nvme_admin": false, 00:18:03.112 "nvme_io": false, 00:18:03.112 "nvme_io_md": false, 00:18:03.112 "write_zeroes": true, 00:18:03.112 "zcopy": true, 00:18:03.112 "get_zone_info": false, 00:18:03.112 "zone_management": false, 00:18:03.112 "zone_append": false, 00:18:03.112 "compare": false, 00:18:03.112 "compare_and_write": false, 00:18:03.112 "abort": true, 00:18:03.112 "seek_hole": false, 00:18:03.112 "seek_data": false, 00:18:03.112 "copy": true, 00:18:03.112 "nvme_iov_md": false 00:18:03.112 }, 00:18:03.112 "memory_domains": [ 00:18:03.112 { 00:18:03.112 "dma_device_id": "system", 00:18:03.112 "dma_device_type": 1 00:18:03.112 }, 00:18:03.112 { 00:18:03.112 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.112 "dma_device_type": 2 00:18:03.112 } 00:18:03.112 ], 00:18:03.112 "driver_specific": {} 00:18:03.112 }' 00:18:03.112 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.371 11:59:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:03.371 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.630 11:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:03.630 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:03.630 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:03.630 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:03.630 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:03.890 "name": "BaseBdev3", 00:18:03.890 "aliases": [ 00:18:03.890 "dcf5c137-3ace-456c-8417-779ef028cb16" 00:18:03.890 ], 00:18:03.890 "product_name": "Malloc disk", 00:18:03.890 "block_size": 512, 00:18:03.890 "num_blocks": 65536, 00:18:03.890 "uuid": "dcf5c137-3ace-456c-8417-779ef028cb16", 00:18:03.890 "assigned_rate_limits": { 00:18:03.890 "rw_ios_per_sec": 0, 00:18:03.890 "rw_mbytes_per_sec": 0, 00:18:03.890 "r_mbytes_per_sec": 0, 00:18:03.890 "w_mbytes_per_sec": 0 00:18:03.890 }, 00:18:03.890 "claimed": true, 00:18:03.890 "claim_type": "exclusive_write", 
00:18:03.890 "zoned": false, 00:18:03.890 "supported_io_types": { 00:18:03.890 "read": true, 00:18:03.890 "write": true, 00:18:03.890 "unmap": true, 00:18:03.890 "flush": true, 00:18:03.890 "reset": true, 00:18:03.890 "nvme_admin": false, 00:18:03.890 "nvme_io": false, 00:18:03.890 "nvme_io_md": false, 00:18:03.890 "write_zeroes": true, 00:18:03.890 "zcopy": true, 00:18:03.890 "get_zone_info": false, 00:18:03.890 "zone_management": false, 00:18:03.890 "zone_append": false, 00:18:03.890 "compare": false, 00:18:03.890 "compare_and_write": false, 00:18:03.890 "abort": true, 00:18:03.890 "seek_hole": false, 00:18:03.890 "seek_data": false, 00:18:03.890 "copy": true, 00:18:03.890 "nvme_iov_md": false 00:18:03.890 }, 00:18:03.890 "memory_domains": [ 00:18:03.890 { 00:18:03.890 "dma_device_id": "system", 00:18:03.890 "dma_device_type": 1 00:18:03.890 }, 00:18:03.890 { 00:18:03.890 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.890 "dma_device_type": 2 00:18:03.890 } 00:18:03.890 ], 00:18:03.890 "driver_specific": {} 00:18:03.890 }' 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:03.890 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.149 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:04.149 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:04.149 11:59:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.149 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:04.149 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:04.149 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:04.409 [2024-07-15 11:59:17.856330] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:04.409 [2024-07-15 11:59:17.856356] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:04.409 [2024-07-15 11:59:17.856410] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:04.409 [2024-07-15 11:59:17.856674] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:04.409 [2024-07-15 11:59:17.856690] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0c5e0 name Existed_Raid, state offline 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1502778 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1502778 ']' 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1502778 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1502778 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:04.409 11:59:17 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1502778' 00:18:04.409 killing process with pid 1502778 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1502778 00:18:04.409 [2024-07-15 11:59:17.924245] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:04.409 11:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1502778 00:18:04.409 [2024-07-15 11:59:17.954910] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:04.668 11:59:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:04.668 00:18:04.668 real 0m28.306s 00:18:04.668 user 0m51.854s 00:18:04.668 sys 0m5.129s 00:18:04.668 11:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:04.668 11:59:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:04.668 ************************************ 00:18:04.668 END TEST raid_state_function_test 00:18:04.668 ************************************ 00:18:04.668 11:59:18 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:04.668 11:59:18 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:18:04.669 11:59:18 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:04.669 11:59:18 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:04.669 11:59:18 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:04.928 ************************************ 00:18:04.928 START TEST raid_state_function_test_sb 00:18:04.928 ************************************ 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test 
raid1 3 true 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local 
raid_bdev_name=Existed_Raid 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1507058 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1507058' 00:18:04.928 Process raid pid: 1507058 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1507058 /var/tmp/spdk-raid.sock 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1507058 ']' 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:18:04.928 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:04.928 11:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:04.928 [2024-07-15 11:59:18.347090] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:18:04.928 [2024-07-15 11:59:18.347154] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:04.928 [2024-07-15 11:59:18.477273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:05.188 [2024-07-15 11:59:18.579803] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:05.188 [2024-07-15 11:59:18.642340] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:05.188 [2024-07-15 11:59:18.642365] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:05.758 11:59:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:05.758 11:59:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:18:05.758 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:06.017 [2024-07-15 11:59:19.500101] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:06.017 [2024-07-15 11:59:19.500150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:06.017 [2024-07-15 11:59:19.500161] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:18:06.017 [2024-07-15 11:59:19.500174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:06.017 [2024-07-15 11:59:19.500182] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:06.017 [2024-07-15 11:59:19.500193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.017 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.277 11:59:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.277 "name": "Existed_Raid", 00:18:06.277 "uuid": "04a01d8e-0227-4935-a46c-63ddae37bb87", 00:18:06.277 "strip_size_kb": 0, 00:18:06.277 "state": "configuring", 00:18:06.277 "raid_level": "raid1", 00:18:06.277 "superblock": true, 00:18:06.277 "num_base_bdevs": 3, 00:18:06.277 "num_base_bdevs_discovered": 0, 00:18:06.277 "num_base_bdevs_operational": 3, 00:18:06.277 "base_bdevs_list": [ 00:18:06.277 { 00:18:06.277 "name": "BaseBdev1", 00:18:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.277 "is_configured": false, 00:18:06.277 "data_offset": 0, 00:18:06.277 "data_size": 0 00:18:06.277 }, 00:18:06.277 { 00:18:06.277 "name": "BaseBdev2", 00:18:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.277 "is_configured": false, 00:18:06.277 "data_offset": 0, 00:18:06.277 "data_size": 0 00:18:06.277 }, 00:18:06.277 { 00:18:06.277 "name": "BaseBdev3", 00:18:06.277 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.277 "is_configured": false, 00:18:06.277 "data_offset": 0, 00:18:06.277 "data_size": 0 00:18:06.277 } 00:18:06.277 ] 00:18:06.277 }' 00:18:06.277 11:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.277 11:59:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:06.846 11:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:07.105 [2024-07-15 11:59:20.570779] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:07.105 [2024-07-15 11:59:20.570811] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2509b00 name Existed_Raid, state configuring 00:18:07.105 11:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:07.363 [2024-07-15 11:59:20.831492] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:07.363 [2024-07-15 11:59:20.831521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:07.363 [2024-07-15 11:59:20.831531] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:07.363 [2024-07-15 11:59:20.831550] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:07.363 [2024-07-15 11:59:20.831558] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:07.363 [2024-07-15 11:59:20.831570] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:07.363 11:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:07.622 [2024-07-15 11:59:21.102284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:07.622 BaseBdev1 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:07.622 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.881 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:08.140 [ 00:18:08.140 { 00:18:08.140 "name": "BaseBdev1", 00:18:08.140 "aliases": [ 00:18:08.140 "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55" 00:18:08.140 ], 00:18:08.140 "product_name": "Malloc disk", 00:18:08.140 "block_size": 512, 00:18:08.140 "num_blocks": 65536, 00:18:08.140 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:08.140 "assigned_rate_limits": { 00:18:08.140 "rw_ios_per_sec": 0, 00:18:08.140 "rw_mbytes_per_sec": 0, 00:18:08.140 "r_mbytes_per_sec": 0, 00:18:08.140 "w_mbytes_per_sec": 0 00:18:08.140 }, 00:18:08.140 "claimed": true, 00:18:08.140 "claim_type": "exclusive_write", 00:18:08.140 "zoned": false, 00:18:08.140 "supported_io_types": { 00:18:08.140 "read": true, 00:18:08.140 "write": true, 00:18:08.140 "unmap": true, 00:18:08.140 "flush": true, 00:18:08.140 "reset": true, 00:18:08.140 "nvme_admin": false, 00:18:08.140 "nvme_io": false, 00:18:08.140 "nvme_io_md": false, 00:18:08.140 "write_zeroes": true, 00:18:08.140 "zcopy": true, 00:18:08.140 "get_zone_info": false, 00:18:08.140 "zone_management": false, 00:18:08.140 "zone_append": false, 00:18:08.140 "compare": false, 00:18:08.140 "compare_and_write": false, 00:18:08.140 "abort": true, 00:18:08.140 "seek_hole": false, 00:18:08.140 "seek_data": false, 00:18:08.140 "copy": true, 00:18:08.140 "nvme_iov_md": false 00:18:08.140 }, 00:18:08.140 "memory_domains": [ 00:18:08.140 { 00:18:08.140 "dma_device_id": "system", 00:18:08.140 "dma_device_type": 1 00:18:08.140 }, 00:18:08.140 { 00:18:08.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.140 
"dma_device_type": 2 00:18:08.140 } 00:18:08.140 ], 00:18:08.140 "driver_specific": {} 00:18:08.140 } 00:18:08.140 ] 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.140 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.399 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.399 "name": "Existed_Raid", 00:18:08.399 "uuid": "b4b4e601-bdfd-49a3-9909-3582facd2845", 00:18:08.399 "strip_size_kb": 0, 
00:18:08.399 "state": "configuring", 00:18:08.399 "raid_level": "raid1", 00:18:08.399 "superblock": true, 00:18:08.399 "num_base_bdevs": 3, 00:18:08.399 "num_base_bdevs_discovered": 1, 00:18:08.399 "num_base_bdevs_operational": 3, 00:18:08.399 "base_bdevs_list": [ 00:18:08.399 { 00:18:08.399 "name": "BaseBdev1", 00:18:08.399 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:08.399 "is_configured": true, 00:18:08.399 "data_offset": 2048, 00:18:08.399 "data_size": 63488 00:18:08.399 }, 00:18:08.399 { 00:18:08.399 "name": "BaseBdev2", 00:18:08.399 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.399 "is_configured": false, 00:18:08.399 "data_offset": 0, 00:18:08.399 "data_size": 0 00:18:08.399 }, 00:18:08.399 { 00:18:08.400 "name": "BaseBdev3", 00:18:08.400 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:08.400 "is_configured": false, 00:18:08.400 "data_offset": 0, 00:18:08.400 "data_size": 0 00:18:08.400 } 00:18:08.400 ] 00:18:08.400 }' 00:18:08.400 11:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.400 11:59:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:08.968 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:09.228 [2024-07-15 11:59:22.730763] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:09.228 [2024-07-15 11:59:22.730806] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2509390 name Existed_Raid, state configuring 00:18:09.228 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:09.488 [2024-07-15 11:59:22.971437] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:09.488 [2024-07-15 11:59:22.972871] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:09.488 [2024-07-15 11:59:22.972902] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:09.488 [2024-07-15 11:59:22.972912] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:09.488 [2024-07-15 11:59:22.972925] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.488 11:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:09.747 11:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:09.747 "name": "Existed_Raid", 00:18:09.747 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:09.747 "strip_size_kb": 0, 00:18:09.747 "state": "configuring", 00:18:09.747 "raid_level": "raid1", 00:18:09.747 "superblock": true, 00:18:09.747 "num_base_bdevs": 3, 00:18:09.747 "num_base_bdevs_discovered": 1, 00:18:09.747 "num_base_bdevs_operational": 3, 00:18:09.747 "base_bdevs_list": [ 00:18:09.747 { 00:18:09.747 "name": "BaseBdev1", 00:18:09.747 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:09.747 "is_configured": true, 00:18:09.747 "data_offset": 2048, 00:18:09.747 "data_size": 63488 00:18:09.747 }, 00:18:09.747 { 00:18:09.747 "name": "BaseBdev2", 00:18:09.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.747 "is_configured": false, 00:18:09.747 "data_offset": 0, 00:18:09.747 "data_size": 0 00:18:09.747 }, 00:18:09.747 { 00:18:09.747 "name": "BaseBdev3", 00:18:09.747 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:09.747 "is_configured": false, 00:18:09.747 "data_offset": 0, 00:18:09.747 "data_size": 0 00:18:09.747 } 00:18:09.747 ] 00:18:09.747 }' 00:18:09.747 11:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:09.747 11:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:10.316 11:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:10.576 
[2024-07-15 11:59:24.073795] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:10.576 BaseBdev2 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:10.576 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:10.835 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:11.094 [ 00:18:11.094 { 00:18:11.094 "name": "BaseBdev2", 00:18:11.094 "aliases": [ 00:18:11.094 "edf22c7e-c062-4c2a-803b-ced891ad83fa" 00:18:11.094 ], 00:18:11.094 "product_name": "Malloc disk", 00:18:11.094 "block_size": 512, 00:18:11.094 "num_blocks": 65536, 00:18:11.094 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:11.094 "assigned_rate_limits": { 00:18:11.094 "rw_ios_per_sec": 0, 00:18:11.094 "rw_mbytes_per_sec": 0, 00:18:11.094 "r_mbytes_per_sec": 0, 00:18:11.094 "w_mbytes_per_sec": 0 00:18:11.094 }, 00:18:11.094 "claimed": true, 00:18:11.094 "claim_type": "exclusive_write", 00:18:11.094 "zoned": false, 00:18:11.094 "supported_io_types": { 00:18:11.094 "read": true, 00:18:11.094 "write": true, 00:18:11.094 "unmap": 
true, 00:18:11.094 "flush": true, 00:18:11.094 "reset": true, 00:18:11.094 "nvme_admin": false, 00:18:11.094 "nvme_io": false, 00:18:11.094 "nvme_io_md": false, 00:18:11.094 "write_zeroes": true, 00:18:11.094 "zcopy": true, 00:18:11.094 "get_zone_info": false, 00:18:11.094 "zone_management": false, 00:18:11.094 "zone_append": false, 00:18:11.094 "compare": false, 00:18:11.094 "compare_and_write": false, 00:18:11.094 "abort": true, 00:18:11.094 "seek_hole": false, 00:18:11.094 "seek_data": false, 00:18:11.094 "copy": true, 00:18:11.094 "nvme_iov_md": false 00:18:11.094 }, 00:18:11.094 "memory_domains": [ 00:18:11.094 { 00:18:11.094 "dma_device_id": "system", 00:18:11.094 "dma_device_type": 1 00:18:11.094 }, 00:18:11.094 { 00:18:11.094 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:11.094 "dma_device_type": 2 00:18:11.094 } 00:18:11.094 ], 00:18:11.094 "driver_specific": {} 00:18:11.094 } 00:18:11.094 ] 00:18:11.094 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:11.094 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:11.095 
11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.095 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.352 11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.352 "name": "Existed_Raid", 00:18:11.352 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:11.352 "strip_size_kb": 0, 00:18:11.352 "state": "configuring", 00:18:11.352 "raid_level": "raid1", 00:18:11.352 "superblock": true, 00:18:11.352 "num_base_bdevs": 3, 00:18:11.352 "num_base_bdevs_discovered": 2, 00:18:11.352 "num_base_bdevs_operational": 3, 00:18:11.352 "base_bdevs_list": [ 00:18:11.352 { 00:18:11.352 "name": "BaseBdev1", 00:18:11.352 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:11.352 "is_configured": true, 00:18:11.352 "data_offset": 2048, 00:18:11.352 "data_size": 63488 00:18:11.352 }, 00:18:11.352 { 00:18:11.352 "name": "BaseBdev2", 00:18:11.352 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:11.352 "is_configured": true, 00:18:11.352 "data_offset": 2048, 00:18:11.352 "data_size": 63488 00:18:11.352 }, 00:18:11.352 { 00:18:11.352 "name": "BaseBdev3", 00:18:11.352 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.352 "is_configured": false, 00:18:11.352 "data_offset": 0, 00:18:11.352 "data_size": 0 00:18:11.352 } 00:18:11.352 ] 00:18:11.352 }' 00:18:11.352 
11:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.352 11:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.914 11:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:12.171 [2024-07-15 11:59:25.633375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:12.171 [2024-07-15 11:59:25.633536] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x250a480 00:18:12.171 [2024-07-15 11:59:25.633550] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:12.171 [2024-07-15 11:59:25.633733] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26bc570 00:18:12.171 [2024-07-15 11:59:25.633857] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250a480 00:18:12.171 [2024-07-15 11:59:25.633867] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250a480 00:18:12.171 [2024-07-15 11:59:25.633958] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:12.171 BaseBdev3 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:18:12.171 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:12.428 11:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:12.686 [ 00:18:12.686 { 00:18:12.686 "name": "BaseBdev3", 00:18:12.686 "aliases": [ 00:18:12.686 "51f4fe29-2567-434a-801f-0c2a257584d0" 00:18:12.686 ], 00:18:12.686 "product_name": "Malloc disk", 00:18:12.686 "block_size": 512, 00:18:12.686 "num_blocks": 65536, 00:18:12.686 "uuid": "51f4fe29-2567-434a-801f-0c2a257584d0", 00:18:12.686 "assigned_rate_limits": { 00:18:12.686 "rw_ios_per_sec": 0, 00:18:12.686 "rw_mbytes_per_sec": 0, 00:18:12.686 "r_mbytes_per_sec": 0, 00:18:12.686 "w_mbytes_per_sec": 0 00:18:12.686 }, 00:18:12.686 "claimed": true, 00:18:12.686 "claim_type": "exclusive_write", 00:18:12.686 "zoned": false, 00:18:12.686 "supported_io_types": { 00:18:12.686 "read": true, 00:18:12.686 "write": true, 00:18:12.686 "unmap": true, 00:18:12.686 "flush": true, 00:18:12.686 "reset": true, 00:18:12.686 "nvme_admin": false, 00:18:12.686 "nvme_io": false, 00:18:12.686 "nvme_io_md": false, 00:18:12.686 "write_zeroes": true, 00:18:12.686 "zcopy": true, 00:18:12.686 "get_zone_info": false, 00:18:12.686 "zone_management": false, 00:18:12.686 "zone_append": false, 00:18:12.686 "compare": false, 00:18:12.686 "compare_and_write": false, 00:18:12.686 "abort": true, 00:18:12.686 "seek_hole": false, 00:18:12.686 "seek_data": false, 00:18:12.686 "copy": true, 00:18:12.686 "nvme_iov_md": false 00:18:12.686 }, 00:18:12.686 "memory_domains": [ 00:18:12.686 { 00:18:12.686 "dma_device_id": "system", 00:18:12.686 "dma_device_type": 1 00:18:12.686 }, 00:18:12.686 { 00:18:12.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:12.686 
"dma_device_type": 2 00:18:12.686 } 00:18:12.686 ], 00:18:12.686 "driver_specific": {} 00:18:12.686 } 00:18:12.686 ] 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:12.686 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.942 11:59:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.942 "name": "Existed_Raid", 00:18:12.942 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:12.942 "strip_size_kb": 0, 00:18:12.942 "state": "online", 00:18:12.942 "raid_level": "raid1", 00:18:12.942 "superblock": true, 00:18:12.942 "num_base_bdevs": 3, 00:18:12.942 "num_base_bdevs_discovered": 3, 00:18:12.942 "num_base_bdevs_operational": 3, 00:18:12.942 "base_bdevs_list": [ 00:18:12.942 { 00:18:12.942 "name": "BaseBdev1", 00:18:12.942 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:12.942 "is_configured": true, 00:18:12.942 "data_offset": 2048, 00:18:12.942 "data_size": 63488 00:18:12.942 }, 00:18:12.942 { 00:18:12.942 "name": "BaseBdev2", 00:18:12.942 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:12.942 "is_configured": true, 00:18:12.942 "data_offset": 2048, 00:18:12.942 "data_size": 63488 00:18:12.942 }, 00:18:12.942 { 00:18:12.942 "name": "BaseBdev3", 00:18:12.942 "uuid": "51f4fe29-2567-434a-801f-0c2a257584d0", 00:18:12.942 "is_configured": true, 00:18:12.942 "data_offset": 2048, 00:18:12.942 "data_size": 63488 00:18:12.942 } 00:18:12.942 ] 00:18:12.942 }' 00:18:12.942 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.942 11:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:13.506 11:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:13.763 [2024-07-15 11:59:27.213887] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:13.763 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:13.763 "name": "Existed_Raid", 00:18:13.763 "aliases": [ 00:18:13.763 "705a927c-ed2d-48aa-b741-d0f2041e1b99" 00:18:13.763 ], 00:18:13.763 "product_name": "Raid Volume", 00:18:13.763 "block_size": 512, 00:18:13.763 "num_blocks": 63488, 00:18:13.763 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:13.763 "assigned_rate_limits": { 00:18:13.763 "rw_ios_per_sec": 0, 00:18:13.763 "rw_mbytes_per_sec": 0, 00:18:13.763 "r_mbytes_per_sec": 0, 00:18:13.763 "w_mbytes_per_sec": 0 00:18:13.763 }, 00:18:13.763 "claimed": false, 00:18:13.763 "zoned": false, 00:18:13.763 "supported_io_types": { 00:18:13.763 "read": true, 00:18:13.763 "write": true, 00:18:13.763 "unmap": false, 00:18:13.763 "flush": false, 00:18:13.763 "reset": true, 00:18:13.763 "nvme_admin": false, 00:18:13.763 "nvme_io": false, 00:18:13.763 "nvme_io_md": false, 00:18:13.763 "write_zeroes": true, 00:18:13.763 "zcopy": false, 00:18:13.763 "get_zone_info": false, 00:18:13.763 "zone_management": false, 00:18:13.763 "zone_append": false, 00:18:13.763 "compare": false, 00:18:13.763 "compare_and_write": false, 00:18:13.763 "abort": false, 00:18:13.763 "seek_hole": false, 00:18:13.763 "seek_data": false, 00:18:13.763 "copy": false, 00:18:13.763 "nvme_iov_md": false 00:18:13.763 }, 00:18:13.763 "memory_domains": [ 00:18:13.763 { 00:18:13.763 "dma_device_id": "system", 00:18:13.763 
"dma_device_type": 1 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.763 "dma_device_type": 2 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "dma_device_id": "system", 00:18:13.763 "dma_device_type": 1 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.763 "dma_device_type": 2 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "dma_device_id": "system", 00:18:13.763 "dma_device_type": 1 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:13.763 "dma_device_type": 2 00:18:13.763 } 00:18:13.763 ], 00:18:13.763 "driver_specific": { 00:18:13.763 "raid": { 00:18:13.763 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:13.763 "strip_size_kb": 0, 00:18:13.763 "state": "online", 00:18:13.763 "raid_level": "raid1", 00:18:13.763 "superblock": true, 00:18:13.763 "num_base_bdevs": 3, 00:18:13.763 "num_base_bdevs_discovered": 3, 00:18:13.763 "num_base_bdevs_operational": 3, 00:18:13.763 "base_bdevs_list": [ 00:18:13.763 { 00:18:13.763 "name": "BaseBdev1", 00:18:13.763 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:13.763 "is_configured": true, 00:18:13.763 "data_offset": 2048, 00:18:13.763 "data_size": 63488 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "name": "BaseBdev2", 00:18:13.763 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:13.763 "is_configured": true, 00:18:13.763 "data_offset": 2048, 00:18:13.763 "data_size": 63488 00:18:13.763 }, 00:18:13.763 { 00:18:13.763 "name": "BaseBdev3", 00:18:13.763 "uuid": "51f4fe29-2567-434a-801f-0c2a257584d0", 00:18:13.763 "is_configured": true, 00:18:13.763 "data_offset": 2048, 00:18:13.763 "data_size": 63488 00:18:13.763 } 00:18:13.763 ] 00:18:13.763 } 00:18:13.763 } 00:18:13.763 }' 00:18:13.763 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:13.763 11:59:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:13.763 BaseBdev2 00:18:13.763 BaseBdev3' 00:18:13.763 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:13.763 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:13.763 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:14.022 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.022 "name": "BaseBdev1", 00:18:14.022 "aliases": [ 00:18:14.022 "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55" 00:18:14.022 ], 00:18:14.022 "product_name": "Malloc disk", 00:18:14.022 "block_size": 512, 00:18:14.022 "num_blocks": 65536, 00:18:14.022 "uuid": "c45cbd4a-c5ed-4cc1-9341-f11e0fa0bf55", 00:18:14.022 "assigned_rate_limits": { 00:18:14.022 "rw_ios_per_sec": 0, 00:18:14.022 "rw_mbytes_per_sec": 0, 00:18:14.022 "r_mbytes_per_sec": 0, 00:18:14.022 "w_mbytes_per_sec": 0 00:18:14.022 }, 00:18:14.022 "claimed": true, 00:18:14.022 "claim_type": "exclusive_write", 00:18:14.022 "zoned": false, 00:18:14.022 "supported_io_types": { 00:18:14.022 "read": true, 00:18:14.022 "write": true, 00:18:14.022 "unmap": true, 00:18:14.022 "flush": true, 00:18:14.022 "reset": true, 00:18:14.022 "nvme_admin": false, 00:18:14.022 "nvme_io": false, 00:18:14.022 "nvme_io_md": false, 00:18:14.022 "write_zeroes": true, 00:18:14.022 "zcopy": true, 00:18:14.022 "get_zone_info": false, 00:18:14.022 "zone_management": false, 00:18:14.022 "zone_append": false, 00:18:14.022 "compare": false, 00:18:14.022 "compare_and_write": false, 00:18:14.022 "abort": true, 00:18:14.022 "seek_hole": false, 00:18:14.022 "seek_data": false, 00:18:14.022 "copy": true, 00:18:14.022 "nvme_iov_md": false 00:18:14.022 }, 00:18:14.022 "memory_domains": 
[ 00:18:14.022 { 00:18:14.022 "dma_device_id": "system", 00:18:14.022 "dma_device_type": 1 00:18:14.022 }, 00:18:14.022 { 00:18:14.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.022 "dma_device_type": 2 00:18:14.022 } 00:18:14.022 ], 00:18:14.022 "driver_specific": {} 00:18:14.022 }' 00:18:14.022 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.022 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:14.280 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.537 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:14.537 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:14.537 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:14.537 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:14.537 11:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:14.796 "name": "BaseBdev2", 00:18:14.796 "aliases": [ 00:18:14.796 "edf22c7e-c062-4c2a-803b-ced891ad83fa" 00:18:14.796 ], 00:18:14.796 "product_name": "Malloc disk", 00:18:14.796 "block_size": 512, 00:18:14.796 "num_blocks": 65536, 00:18:14.796 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:14.796 "assigned_rate_limits": { 00:18:14.796 "rw_ios_per_sec": 0, 00:18:14.796 "rw_mbytes_per_sec": 0, 00:18:14.796 "r_mbytes_per_sec": 0, 00:18:14.796 "w_mbytes_per_sec": 0 00:18:14.796 }, 00:18:14.796 "claimed": true, 00:18:14.796 "claim_type": "exclusive_write", 00:18:14.796 "zoned": false, 00:18:14.796 "supported_io_types": { 00:18:14.796 "read": true, 00:18:14.796 "write": true, 00:18:14.796 "unmap": true, 00:18:14.796 "flush": true, 00:18:14.796 "reset": true, 00:18:14.796 "nvme_admin": false, 00:18:14.796 "nvme_io": false, 00:18:14.796 "nvme_io_md": false, 00:18:14.796 "write_zeroes": true, 00:18:14.796 "zcopy": true, 00:18:14.796 "get_zone_info": false, 00:18:14.796 "zone_management": false, 00:18:14.796 "zone_append": false, 00:18:14.796 "compare": false, 00:18:14.796 "compare_and_write": false, 00:18:14.796 "abort": true, 00:18:14.796 "seek_hole": false, 00:18:14.796 "seek_data": false, 00:18:14.796 "copy": true, 00:18:14.796 "nvme_iov_md": false 00:18:14.796 }, 00:18:14.796 "memory_domains": [ 00:18:14.796 { 00:18:14.796 "dma_device_id": "system", 00:18:14.796 "dma_device_type": 1 00:18:14.796 }, 00:18:14.796 { 00:18:14.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:14.796 "dma_device_type": 2 00:18:14.796 } 00:18:14.796 ], 00:18:14.796 "driver_specific": {} 00:18:14.796 }' 00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:14.796 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:15.055 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:15.314 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:15.314 "name": "BaseBdev3", 00:18:15.314 "aliases": [ 00:18:15.314 "51f4fe29-2567-434a-801f-0c2a257584d0" 00:18:15.314 ], 00:18:15.314 "product_name": "Malloc disk", 00:18:15.314 "block_size": 512, 00:18:15.314 "num_blocks": 65536, 00:18:15.314 "uuid": "51f4fe29-2567-434a-801f-0c2a257584d0", 00:18:15.314 "assigned_rate_limits": { 00:18:15.314 "rw_ios_per_sec": 0, 00:18:15.314 "rw_mbytes_per_sec": 0, 00:18:15.314 "r_mbytes_per_sec": 0, 00:18:15.314 
"w_mbytes_per_sec": 0 00:18:15.314 }, 00:18:15.314 "claimed": true, 00:18:15.314 "claim_type": "exclusive_write", 00:18:15.314 "zoned": false, 00:18:15.314 "supported_io_types": { 00:18:15.314 "read": true, 00:18:15.314 "write": true, 00:18:15.314 "unmap": true, 00:18:15.314 "flush": true, 00:18:15.314 "reset": true, 00:18:15.314 "nvme_admin": false, 00:18:15.314 "nvme_io": false, 00:18:15.314 "nvme_io_md": false, 00:18:15.314 "write_zeroes": true, 00:18:15.314 "zcopy": true, 00:18:15.314 "get_zone_info": false, 00:18:15.314 "zone_management": false, 00:18:15.314 "zone_append": false, 00:18:15.314 "compare": false, 00:18:15.314 "compare_and_write": false, 00:18:15.314 "abort": true, 00:18:15.314 "seek_hole": false, 00:18:15.314 "seek_data": false, 00:18:15.314 "copy": true, 00:18:15.314 "nvme_iov_md": false 00:18:15.314 }, 00:18:15.314 "memory_domains": [ 00:18:15.314 { 00:18:15.314 "dma_device_id": "system", 00:18:15.314 "dma_device_type": 1 00:18:15.314 }, 00:18:15.314 { 00:18:15.314 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.314 "dma_device_type": 2 00:18:15.314 } 00:18:15.314 ], 00:18:15.314 "driver_specific": {} 00:18:15.314 }' 00:18:15.315 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.315 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:15.315 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:15.572 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.572 11:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.572 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:15.829 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:15.829 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:16.088 [2024-07-15 11:59:29.439514] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.088 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.346 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.346 "name": "Existed_Raid", 00:18:16.346 "uuid": "705a927c-ed2d-48aa-b741-d0f2041e1b99", 00:18:16.346 "strip_size_kb": 0, 00:18:16.346 "state": "online", 00:18:16.346 "raid_level": "raid1", 00:18:16.346 "superblock": true, 00:18:16.346 "num_base_bdevs": 3, 00:18:16.346 "num_base_bdevs_discovered": 2, 00:18:16.346 "num_base_bdevs_operational": 2, 00:18:16.346 "base_bdevs_list": [ 00:18:16.346 { 00:18:16.346 "name": null, 00:18:16.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:16.346 "is_configured": false, 00:18:16.346 "data_offset": 2048, 00:18:16.346 "data_size": 63488 00:18:16.346 }, 00:18:16.346 { 00:18:16.346 "name": "BaseBdev2", 00:18:16.346 "uuid": "edf22c7e-c062-4c2a-803b-ced891ad83fa", 00:18:16.346 "is_configured": true, 00:18:16.346 "data_offset": 2048, 00:18:16.346 "data_size": 63488 00:18:16.346 }, 00:18:16.346 { 00:18:16.346 "name": "BaseBdev3", 00:18:16.346 "uuid": "51f4fe29-2567-434a-801f-0c2a257584d0", 00:18:16.346 "is_configured": true, 00:18:16.346 "data_offset": 2048, 00:18:16.346 "data_size": 63488 00:18:16.346 } 
00:18:16.346 ] 00:18:16.346 }' 00:18:16.346 11:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.346 11:59:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:17.013 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:17.013 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:17.013 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.013 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:17.581 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:17.581 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:17.581 11:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:17.841 [2024-07-15 11:59:31.386773] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:17.841 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:17.841 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:17.841 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:17.841 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:18.407 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:18.407 11:59:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:18.407 11:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:18.974 [2024-07-15 11:59:32.426166] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:18.974 [2024-07-15 11:59:32.426261] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:18.974 [2024-07-15 11:59:32.438809] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:18.974 [2024-07-15 11:59:32.438844] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:18.974 [2024-07-15 11:59:32.438856] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250a480 name Existed_Raid, state offline 00:18:18.974 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:18.974 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:18.974 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.974 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:19.541 11:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:20.107 BaseBdev2 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:20.107 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:20.366 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:20.625 [ 00:18:20.625 { 00:18:20.625 "name": "BaseBdev2", 00:18:20.625 "aliases": [ 00:18:20.625 "eaba65cb-8a70-4095-8dc2-c1824e72e2c8" 00:18:20.625 ], 00:18:20.625 "product_name": "Malloc disk", 00:18:20.625 "block_size": 512, 00:18:20.625 "num_blocks": 65536, 00:18:20.625 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:20.625 "assigned_rate_limits": { 00:18:20.625 "rw_ios_per_sec": 0, 00:18:20.625 "rw_mbytes_per_sec": 0, 00:18:20.625 "r_mbytes_per_sec": 0, 00:18:20.625 "w_mbytes_per_sec": 0 00:18:20.625 }, 00:18:20.625 "claimed": false, 00:18:20.625 "zoned": false, 
00:18:20.625 "supported_io_types": { 00:18:20.625 "read": true, 00:18:20.625 "write": true, 00:18:20.625 "unmap": true, 00:18:20.625 "flush": true, 00:18:20.625 "reset": true, 00:18:20.625 "nvme_admin": false, 00:18:20.625 "nvme_io": false, 00:18:20.625 "nvme_io_md": false, 00:18:20.625 "write_zeroes": true, 00:18:20.625 "zcopy": true, 00:18:20.625 "get_zone_info": false, 00:18:20.625 "zone_management": false, 00:18:20.625 "zone_append": false, 00:18:20.625 "compare": false, 00:18:20.625 "compare_and_write": false, 00:18:20.625 "abort": true, 00:18:20.625 "seek_hole": false, 00:18:20.625 "seek_data": false, 00:18:20.625 "copy": true, 00:18:20.625 "nvme_iov_md": false 00:18:20.625 }, 00:18:20.625 "memory_domains": [ 00:18:20.625 { 00:18:20.625 "dma_device_id": "system", 00:18:20.625 "dma_device_type": 1 00:18:20.625 }, 00:18:20.625 { 00:18:20.625 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.625 "dma_device_type": 2 00:18:20.625 } 00:18:20.625 ], 00:18:20.625 "driver_specific": {} 00:18:20.625 } 00:18:20.625 ] 00:18:20.625 11:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:20.625 11:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:20.625 11:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:20.625 11:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:20.885 BaseBdev3 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:20.885 11:59:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:20.885 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:21.144 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:21.403 [ 00:18:21.403 { 00:18:21.403 "name": "BaseBdev3", 00:18:21.403 "aliases": [ 00:18:21.403 "35a9d605-ed41-44ac-a58a-3a1e9acea1f7" 00:18:21.403 ], 00:18:21.403 "product_name": "Malloc disk", 00:18:21.403 "block_size": 512, 00:18:21.403 "num_blocks": 65536, 00:18:21.403 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:21.403 "assigned_rate_limits": { 00:18:21.403 "rw_ios_per_sec": 0, 00:18:21.403 "rw_mbytes_per_sec": 0, 00:18:21.403 "r_mbytes_per_sec": 0, 00:18:21.403 "w_mbytes_per_sec": 0 00:18:21.403 }, 00:18:21.403 "claimed": false, 00:18:21.403 "zoned": false, 00:18:21.403 "supported_io_types": { 00:18:21.403 "read": true, 00:18:21.403 "write": true, 00:18:21.403 "unmap": true, 00:18:21.403 "flush": true, 00:18:21.403 "reset": true, 00:18:21.403 "nvme_admin": false, 00:18:21.403 "nvme_io": false, 00:18:21.403 "nvme_io_md": false, 00:18:21.403 "write_zeroes": true, 00:18:21.403 "zcopy": true, 00:18:21.403 "get_zone_info": false, 00:18:21.403 "zone_management": false, 00:18:21.403 "zone_append": false, 00:18:21.403 "compare": false, 00:18:21.403 "compare_and_write": false, 00:18:21.403 "abort": true, 00:18:21.403 "seek_hole": false, 00:18:21.403 "seek_data": false, 00:18:21.403 "copy": true, 00:18:21.403 "nvme_iov_md": 
false 00:18:21.403 }, 00:18:21.403 "memory_domains": [ 00:18:21.403 { 00:18:21.403 "dma_device_id": "system", 00:18:21.403 "dma_device_type": 1 00:18:21.403 }, 00:18:21.403 { 00:18:21.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.403 "dma_device_type": 2 00:18:21.403 } 00:18:21.403 ], 00:18:21.403 "driver_specific": {} 00:18:21.403 } 00:18:21.403 ] 00:18:21.403 11:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:21.403 11:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:21.403 11:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:21.403 11:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:18:21.403 [2024-07-15 11:59:34.987322] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:21.403 [2024-07-15 11:59:34.987361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:21.403 [2024-07-15 11:59:34.987379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:21.403 [2024-07-15 11:59:34.988698] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:21.663 11:59:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.663 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.231 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.232 "name": "Existed_Raid", 00:18:22.232 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:22.232 "strip_size_kb": 0, 00:18:22.232 "state": "configuring", 00:18:22.232 "raid_level": "raid1", 00:18:22.232 "superblock": true, 00:18:22.232 "num_base_bdevs": 3, 00:18:22.232 "num_base_bdevs_discovered": 2, 00:18:22.232 "num_base_bdevs_operational": 3, 00:18:22.232 "base_bdevs_list": [ 00:18:22.232 { 00:18:22.232 "name": "BaseBdev1", 00:18:22.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.232 "is_configured": false, 00:18:22.232 "data_offset": 0, 00:18:22.232 "data_size": 0 00:18:22.232 }, 00:18:22.232 { 00:18:22.232 "name": "BaseBdev2", 00:18:22.232 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:22.232 "is_configured": true, 00:18:22.232 "data_offset": 2048, 00:18:22.232 "data_size": 63488 00:18:22.232 }, 00:18:22.232 { 00:18:22.232 "name": "BaseBdev3", 
00:18:22.232 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:22.232 "is_configured": true, 00:18:22.232 "data_offset": 2048, 00:18:22.232 "data_size": 63488 00:18:22.232 } 00:18:22.232 ] 00:18:22.232 }' 00:18:22.232 11:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.232 11:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:22.800 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:22.800 [2024-07-15 11:59:36.390995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.059 11:59:36 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.060 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.319 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:23.319 "name": "Existed_Raid", 00:18:23.319 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:23.319 "strip_size_kb": 0, 00:18:23.319 "state": "configuring", 00:18:23.319 "raid_level": "raid1", 00:18:23.319 "superblock": true, 00:18:23.319 "num_base_bdevs": 3, 00:18:23.319 "num_base_bdevs_discovered": 1, 00:18:23.319 "num_base_bdevs_operational": 3, 00:18:23.319 "base_bdevs_list": [ 00:18:23.319 { 00:18:23.319 "name": "BaseBdev1", 00:18:23.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:23.319 "is_configured": false, 00:18:23.319 "data_offset": 0, 00:18:23.319 "data_size": 0 00:18:23.319 }, 00:18:23.319 { 00:18:23.319 "name": null, 00:18:23.319 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:23.319 "is_configured": false, 00:18:23.319 "data_offset": 2048, 00:18:23.319 "data_size": 63488 00:18:23.319 }, 00:18:23.319 { 00:18:23.319 "name": "BaseBdev3", 00:18:23.319 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:23.319 "is_configured": true, 00:18:23.319 "data_offset": 2048, 00:18:23.319 "data_size": 63488 00:18:23.319 } 00:18:23.319 ] 00:18:23.319 }' 00:18:23.319 11:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:23.319 11:59:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.888 11:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.888 11:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:18:24.147 11:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:24.147 11:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:24.406 [2024-07-15 11:59:37.822599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:24.406 BaseBdev1 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:24.406 11:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:24.665 11:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:24.924 [ 00:18:24.924 { 00:18:24.924 "name": "BaseBdev1", 00:18:24.924 "aliases": [ 00:18:24.924 "936cc52d-9c0c-44b5-ab46-251a355dd171" 00:18:24.924 ], 00:18:24.924 "product_name": "Malloc disk", 00:18:24.924 "block_size": 512, 00:18:24.924 "num_blocks": 65536, 00:18:24.924 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:24.924 
"assigned_rate_limits": { 00:18:24.924 "rw_ios_per_sec": 0, 00:18:24.924 "rw_mbytes_per_sec": 0, 00:18:24.924 "r_mbytes_per_sec": 0, 00:18:24.924 "w_mbytes_per_sec": 0 00:18:24.924 }, 00:18:24.924 "claimed": true, 00:18:24.924 "claim_type": "exclusive_write", 00:18:24.924 "zoned": false, 00:18:24.924 "supported_io_types": { 00:18:24.924 "read": true, 00:18:24.924 "write": true, 00:18:24.924 "unmap": true, 00:18:24.924 "flush": true, 00:18:24.924 "reset": true, 00:18:24.924 "nvme_admin": false, 00:18:24.924 "nvme_io": false, 00:18:24.924 "nvme_io_md": false, 00:18:24.924 "write_zeroes": true, 00:18:24.924 "zcopy": true, 00:18:24.924 "get_zone_info": false, 00:18:24.924 "zone_management": false, 00:18:24.924 "zone_append": false, 00:18:24.924 "compare": false, 00:18:24.924 "compare_and_write": false, 00:18:24.924 "abort": true, 00:18:24.924 "seek_hole": false, 00:18:24.924 "seek_data": false, 00:18:24.924 "copy": true, 00:18:24.924 "nvme_iov_md": false 00:18:24.924 }, 00:18:24.924 "memory_domains": [ 00:18:24.924 { 00:18:24.924 "dma_device_id": "system", 00:18:24.924 "dma_device_type": 1 00:18:24.924 }, 00:18:24.924 { 00:18:24.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:24.924 "dma_device_type": 2 00:18:24.924 } 00:18:24.924 ], 00:18:24.924 "driver_specific": {} 00:18:24.924 } 00:18:24.924 ] 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:24.924 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.184 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.184 "name": "Existed_Raid", 00:18:25.184 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:25.184 "strip_size_kb": 0, 00:18:25.184 "state": "configuring", 00:18:25.184 "raid_level": "raid1", 00:18:25.184 "superblock": true, 00:18:25.184 "num_base_bdevs": 3, 00:18:25.184 "num_base_bdevs_discovered": 2, 00:18:25.184 "num_base_bdevs_operational": 3, 00:18:25.184 "base_bdevs_list": [ 00:18:25.184 { 00:18:25.184 "name": "BaseBdev1", 00:18:25.184 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:25.184 "is_configured": true, 00:18:25.184 "data_offset": 2048, 00:18:25.184 "data_size": 63488 00:18:25.184 }, 00:18:25.184 { 00:18:25.184 "name": null, 00:18:25.184 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:25.184 "is_configured": false, 00:18:25.184 "data_offset": 2048, 00:18:25.184 "data_size": 63488 00:18:25.184 }, 00:18:25.184 { 00:18:25.184 "name": "BaseBdev3", 00:18:25.184 "uuid": 
"35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:25.184 "is_configured": true, 00:18:25.184 "data_offset": 2048, 00:18:25.184 "data_size": 63488 00:18:25.184 } 00:18:25.184 ] 00:18:25.184 }' 00:18:25.184 11:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.184 11:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:25.752 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.752 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:26.011 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:26.011 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:26.271 [2024-07-15 11:59:39.683591] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.271 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:26.529 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:26.529 "name": "Existed_Raid", 00:18:26.529 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:26.529 "strip_size_kb": 0, 00:18:26.529 "state": "configuring", 00:18:26.529 "raid_level": "raid1", 00:18:26.529 "superblock": true, 00:18:26.529 "num_base_bdevs": 3, 00:18:26.529 "num_base_bdevs_discovered": 1, 00:18:26.529 "num_base_bdevs_operational": 3, 00:18:26.529 "base_bdevs_list": [ 00:18:26.529 { 00:18:26.529 "name": "BaseBdev1", 00:18:26.529 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:26.529 "is_configured": true, 00:18:26.529 "data_offset": 2048, 00:18:26.529 "data_size": 63488 00:18:26.529 }, 00:18:26.529 { 00:18:26.529 "name": null, 00:18:26.529 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:26.529 "is_configured": false, 00:18:26.529 "data_offset": 2048, 00:18:26.529 "data_size": 63488 00:18:26.529 }, 00:18:26.529 { 00:18:26.529 "name": null, 00:18:26.529 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:26.529 "is_configured": false, 00:18:26.529 "data_offset": 2048, 00:18:26.529 "data_size": 63488 00:18:26.529 } 00:18:26.529 ] 00:18:26.529 }' 00:18:26.529 11:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:26.529 11:59:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:27.095 11:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:27.095 11:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.354 11:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:27.354 11:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:27.613 [2024-07-15 11:59:41.207646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:27.871 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.872 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:28.131 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:28.131 "name": "Existed_Raid", 00:18:28.131 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:28.131 "strip_size_kb": 0, 00:18:28.131 "state": "configuring", 00:18:28.131 "raid_level": "raid1", 00:18:28.131 "superblock": true, 00:18:28.131 "num_base_bdevs": 3, 00:18:28.131 "num_base_bdevs_discovered": 2, 00:18:28.131 "num_base_bdevs_operational": 3, 00:18:28.131 "base_bdevs_list": [ 00:18:28.131 { 00:18:28.131 "name": "BaseBdev1", 00:18:28.131 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:28.131 "is_configured": true, 00:18:28.131 "data_offset": 2048, 00:18:28.131 "data_size": 63488 00:18:28.131 }, 00:18:28.131 { 00:18:28.131 "name": null, 00:18:28.131 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:28.131 "is_configured": false, 00:18:28.131 "data_offset": 2048, 00:18:28.131 "data_size": 63488 00:18:28.131 }, 00:18:28.131 { 00:18:28.131 "name": "BaseBdev3", 00:18:28.131 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:28.131 "is_configured": true, 00:18:28.131 "data_offset": 2048, 00:18:28.131 "data_size": 63488 00:18:28.131 } 00:18:28.131 ] 00:18:28.131 }' 00:18:28.131 11:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:28.131 11:59:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:29.067 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.067 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:29.067 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:29.067 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:29.325 [2024-07-15 11:59:42.723698] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.325 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:29.584 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.584 "name": "Existed_Raid", 00:18:29.584 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:29.584 "strip_size_kb": 0, 00:18:29.584 "state": "configuring", 00:18:29.584 "raid_level": "raid1", 00:18:29.584 "superblock": true, 00:18:29.584 "num_base_bdevs": 3, 00:18:29.584 "num_base_bdevs_discovered": 1, 00:18:29.584 "num_base_bdevs_operational": 3, 00:18:29.584 "base_bdevs_list": [ 00:18:29.584 { 00:18:29.584 "name": null, 00:18:29.584 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:29.584 "is_configured": false, 00:18:29.584 "data_offset": 2048, 00:18:29.584 "data_size": 63488 00:18:29.584 }, 00:18:29.584 { 00:18:29.584 "name": null, 00:18:29.584 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:29.584 "is_configured": false, 00:18:29.584 "data_offset": 2048, 00:18:29.584 "data_size": 63488 00:18:29.584 }, 00:18:29.584 { 00:18:29.584 "name": "BaseBdev3", 00:18:29.584 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:29.584 "is_configured": true, 00:18:29.584 "data_offset": 2048, 00:18:29.584 "data_size": 63488 00:18:29.584 } 00:18:29.584 ] 00:18:29.584 }' 00:18:29.584 11:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.584 11:59:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:30.150 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.150 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:18:30.150 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:30.150 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:30.409 [2024-07-15 11:59:43.961578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:30.409 11:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:18:30.985 11:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:30.985 "name": "Existed_Raid", 00:18:30.985 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:30.985 "strip_size_kb": 0, 00:18:30.985 "state": "configuring", 00:18:30.985 "raid_level": "raid1", 00:18:30.985 "superblock": true, 00:18:30.985 "num_base_bdevs": 3, 00:18:30.985 "num_base_bdevs_discovered": 2, 00:18:30.985 "num_base_bdevs_operational": 3, 00:18:30.985 "base_bdevs_list": [ 00:18:30.985 { 00:18:30.985 "name": null, 00:18:30.985 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:30.985 "is_configured": false, 00:18:30.985 "data_offset": 2048, 00:18:30.985 "data_size": 63488 00:18:30.985 }, 00:18:30.985 { 00:18:30.985 "name": "BaseBdev2", 00:18:30.985 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:30.985 "is_configured": true, 00:18:30.985 "data_offset": 2048, 00:18:30.985 "data_size": 63488 00:18:30.986 }, 00:18:30.986 { 00:18:30.986 "name": "BaseBdev3", 00:18:30.986 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:30.986 "is_configured": true, 00:18:30.986 "data_offset": 2048, 00:18:30.986 "data_size": 63488 00:18:30.986 } 00:18:30.986 ] 00:18:30.986 }' 00:18:30.986 11:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:30.986 11:59:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.555 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.555 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:31.814 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:31.814 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:31.814 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:32.074 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 936cc52d-9c0c-44b5-ab46-251a355dd171 00:18:32.334 [2024-07-15 11:59:45.867132] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:32.334 [2024-07-15 11:59:45.867282] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x250c7c0 00:18:32.334 [2024-07-15 11:59:45.867295] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:32.334 [2024-07-15 11:59:45.867469] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2509f70 00:18:32.334 [2024-07-15 11:59:45.867597] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x250c7c0 00:18:32.334 [2024-07-15 11:59:45.867607] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x250c7c0 00:18:32.334 [2024-07-15 11:59:45.867721] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:32.334 NewBaseBdev 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:18:32.334 
11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:18:32.334 11:59:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:32.593 11:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:32.853 [ 00:18:32.853 { 00:18:32.853 "name": "NewBaseBdev", 00:18:32.853 "aliases": [ 00:18:32.853 "936cc52d-9c0c-44b5-ab46-251a355dd171" 00:18:32.853 ], 00:18:32.853 "product_name": "Malloc disk", 00:18:32.853 "block_size": 512, 00:18:32.853 "num_blocks": 65536, 00:18:32.853 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:32.853 "assigned_rate_limits": { 00:18:32.853 "rw_ios_per_sec": 0, 00:18:32.853 "rw_mbytes_per_sec": 0, 00:18:32.853 "r_mbytes_per_sec": 0, 00:18:32.853 "w_mbytes_per_sec": 0 00:18:32.853 }, 00:18:32.853 "claimed": true, 00:18:32.853 "claim_type": "exclusive_write", 00:18:32.853 "zoned": false, 00:18:32.853 "supported_io_types": { 00:18:32.853 "read": true, 00:18:32.853 "write": true, 00:18:32.853 "unmap": true, 00:18:32.853 "flush": true, 00:18:32.853 "reset": true, 00:18:32.853 "nvme_admin": false, 00:18:32.853 "nvme_io": false, 00:18:32.853 "nvme_io_md": false, 00:18:32.853 "write_zeroes": true, 00:18:32.853 "zcopy": true, 00:18:32.853 "get_zone_info": false, 00:18:32.853 "zone_management": false, 00:18:32.853 "zone_append": false, 00:18:32.853 "compare": false, 00:18:32.853 "compare_and_write": false, 00:18:32.853 "abort": true, 00:18:32.853 "seek_hole": false, 00:18:32.853 "seek_data": false, 00:18:32.853 "copy": true, 00:18:32.853 "nvme_iov_md": false 00:18:32.853 }, 00:18:32.853 "memory_domains": [ 00:18:32.853 { 00:18:32.853 "dma_device_id": "system", 00:18:32.853 "dma_device_type": 1 00:18:32.853 
}, 00:18:32.853 { 00:18:32.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.853 "dma_device_type": 2 00:18:32.853 } 00:18:32.853 ], 00:18:32.853 "driver_specific": {} 00:18:32.853 } 00:18:32.853 ] 00:18:32.853 11:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:18:32.853 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:18:32.853 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:32.853 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:32.854 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.113 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.113 "name": "Existed_Raid", 00:18:33.113 "uuid": 
"97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:33.113 "strip_size_kb": 0, 00:18:33.113 "state": "online", 00:18:33.113 "raid_level": "raid1", 00:18:33.113 "superblock": true, 00:18:33.113 "num_base_bdevs": 3, 00:18:33.113 "num_base_bdevs_discovered": 3, 00:18:33.113 "num_base_bdevs_operational": 3, 00:18:33.113 "base_bdevs_list": [ 00:18:33.113 { 00:18:33.113 "name": "NewBaseBdev", 00:18:33.113 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:33.113 "is_configured": true, 00:18:33.113 "data_offset": 2048, 00:18:33.113 "data_size": 63488 00:18:33.113 }, 00:18:33.113 { 00:18:33.113 "name": "BaseBdev2", 00:18:33.113 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:33.113 "is_configured": true, 00:18:33.113 "data_offset": 2048, 00:18:33.113 "data_size": 63488 00:18:33.113 }, 00:18:33.113 { 00:18:33.113 "name": "BaseBdev3", 00:18:33.113 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:33.113 "is_configured": true, 00:18:33.113 "data_offset": 2048, 00:18:33.113 "data_size": 63488 00:18:33.113 } 00:18:33.113 ] 00:18:33.113 }' 00:18:33.113 11:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.113 11:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:33.682 11:59:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:33.682 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:33.942 [2024-07-15 11:59:47.431573] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:33.942 "name": "Existed_Raid", 00:18:33.942 "aliases": [ 00:18:33.942 "97ef62bb-4a47-4b59-a164-03342bfe85a4" 00:18:33.942 ], 00:18:33.942 "product_name": "Raid Volume", 00:18:33.942 "block_size": 512, 00:18:33.942 "num_blocks": 63488, 00:18:33.942 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:33.942 "assigned_rate_limits": { 00:18:33.942 "rw_ios_per_sec": 0, 00:18:33.942 "rw_mbytes_per_sec": 0, 00:18:33.942 "r_mbytes_per_sec": 0, 00:18:33.942 "w_mbytes_per_sec": 0 00:18:33.942 }, 00:18:33.942 "claimed": false, 00:18:33.942 "zoned": false, 00:18:33.942 "supported_io_types": { 00:18:33.942 "read": true, 00:18:33.942 "write": true, 00:18:33.942 "unmap": false, 00:18:33.942 "flush": false, 00:18:33.942 "reset": true, 00:18:33.942 "nvme_admin": false, 00:18:33.942 "nvme_io": false, 00:18:33.942 "nvme_io_md": false, 00:18:33.942 "write_zeroes": true, 00:18:33.942 "zcopy": false, 00:18:33.942 "get_zone_info": false, 00:18:33.942 "zone_management": false, 00:18:33.942 "zone_append": false, 00:18:33.942 "compare": false, 00:18:33.942 "compare_and_write": false, 00:18:33.942 "abort": false, 00:18:33.942 "seek_hole": false, 00:18:33.942 "seek_data": false, 00:18:33.942 "copy": false, 00:18:33.942 "nvme_iov_md": false 00:18:33.942 }, 00:18:33.942 "memory_domains": [ 00:18:33.942 { 00:18:33.942 "dma_device_id": "system", 00:18:33.942 "dma_device_type": 1 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.942 
"dma_device_type": 2 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "dma_device_id": "system", 00:18:33.942 "dma_device_type": 1 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.942 "dma_device_type": 2 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "dma_device_id": "system", 00:18:33.942 "dma_device_type": 1 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:33.942 "dma_device_type": 2 00:18:33.942 } 00:18:33.942 ], 00:18:33.942 "driver_specific": { 00:18:33.942 "raid": { 00:18:33.942 "uuid": "97ef62bb-4a47-4b59-a164-03342bfe85a4", 00:18:33.942 "strip_size_kb": 0, 00:18:33.942 "state": "online", 00:18:33.942 "raid_level": "raid1", 00:18:33.942 "superblock": true, 00:18:33.942 "num_base_bdevs": 3, 00:18:33.942 "num_base_bdevs_discovered": 3, 00:18:33.942 "num_base_bdevs_operational": 3, 00:18:33.942 "base_bdevs_list": [ 00:18:33.942 { 00:18:33.942 "name": "NewBaseBdev", 00:18:33.942 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:33.942 "is_configured": true, 00:18:33.942 "data_offset": 2048, 00:18:33.942 "data_size": 63488 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "name": "BaseBdev2", 00:18:33.942 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:33.942 "is_configured": true, 00:18:33.942 "data_offset": 2048, 00:18:33.942 "data_size": 63488 00:18:33.942 }, 00:18:33.942 { 00:18:33.942 "name": "BaseBdev3", 00:18:33.942 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:33.942 "is_configured": true, 00:18:33.942 "data_offset": 2048, 00:18:33.942 "data_size": 63488 00:18:33.942 } 00:18:33.942 ] 00:18:33.942 } 00:18:33.942 } 00:18:33.942 }' 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:33.942 BaseBdev2 00:18:33.942 
BaseBdev3' 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:33.942 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.203 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:34.203 "name": "NewBaseBdev", 00:18:34.203 "aliases": [ 00:18:34.203 "936cc52d-9c0c-44b5-ab46-251a355dd171" 00:18:34.203 ], 00:18:34.203 "product_name": "Malloc disk", 00:18:34.203 "block_size": 512, 00:18:34.203 "num_blocks": 65536, 00:18:34.203 "uuid": "936cc52d-9c0c-44b5-ab46-251a355dd171", 00:18:34.203 "assigned_rate_limits": { 00:18:34.203 "rw_ios_per_sec": 0, 00:18:34.203 "rw_mbytes_per_sec": 0, 00:18:34.203 "r_mbytes_per_sec": 0, 00:18:34.203 "w_mbytes_per_sec": 0 00:18:34.203 }, 00:18:34.203 "claimed": true, 00:18:34.203 "claim_type": "exclusive_write", 00:18:34.203 "zoned": false, 00:18:34.203 "supported_io_types": { 00:18:34.203 "read": true, 00:18:34.203 "write": true, 00:18:34.203 "unmap": true, 00:18:34.203 "flush": true, 00:18:34.203 "reset": true, 00:18:34.203 "nvme_admin": false, 00:18:34.203 "nvme_io": false, 00:18:34.203 "nvme_io_md": false, 00:18:34.203 "write_zeroes": true, 00:18:34.203 "zcopy": true, 00:18:34.203 "get_zone_info": false, 00:18:34.203 "zone_management": false, 00:18:34.203 "zone_append": false, 00:18:34.203 "compare": false, 00:18:34.203 "compare_and_write": false, 00:18:34.203 "abort": true, 00:18:34.203 "seek_hole": false, 00:18:34.203 "seek_data": false, 00:18:34.203 "copy": true, 00:18:34.203 "nvme_iov_md": false 00:18:34.203 }, 00:18:34.203 "memory_domains": [ 00:18:34.203 { 00:18:34.203 "dma_device_id": "system", 00:18:34.203 "dma_device_type": 1 00:18:34.203 }, 00:18:34.203 { 
00:18:34.203 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.203 "dma_device_type": 2 00:18:34.203 } 00:18:34.203 ], 00:18:34.203 "driver_specific": {} 00:18:34.203 }' 00:18:34.203 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.463 11:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.463 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:34.463 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.723 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:34.723 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:34.723 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:34.723 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:34.723 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:34.982 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:34.982 "name": 
"BaseBdev2", 00:18:34.982 "aliases": [ 00:18:34.982 "eaba65cb-8a70-4095-8dc2-c1824e72e2c8" 00:18:34.982 ], 00:18:34.982 "product_name": "Malloc disk", 00:18:34.982 "block_size": 512, 00:18:34.982 "num_blocks": 65536, 00:18:34.982 "uuid": "eaba65cb-8a70-4095-8dc2-c1824e72e2c8", 00:18:34.982 "assigned_rate_limits": { 00:18:34.982 "rw_ios_per_sec": 0, 00:18:34.982 "rw_mbytes_per_sec": 0, 00:18:34.982 "r_mbytes_per_sec": 0, 00:18:34.982 "w_mbytes_per_sec": 0 00:18:34.982 }, 00:18:34.982 "claimed": true, 00:18:34.982 "claim_type": "exclusive_write", 00:18:34.982 "zoned": false, 00:18:34.982 "supported_io_types": { 00:18:34.982 "read": true, 00:18:34.982 "write": true, 00:18:34.982 "unmap": true, 00:18:34.982 "flush": true, 00:18:34.982 "reset": true, 00:18:34.982 "nvme_admin": false, 00:18:34.982 "nvme_io": false, 00:18:34.982 "nvme_io_md": false, 00:18:34.982 "write_zeroes": true, 00:18:34.982 "zcopy": true, 00:18:34.982 "get_zone_info": false, 00:18:34.982 "zone_management": false, 00:18:34.982 "zone_append": false, 00:18:34.982 "compare": false, 00:18:34.982 "compare_and_write": false, 00:18:34.982 "abort": true, 00:18:34.982 "seek_hole": false, 00:18:34.982 "seek_data": false, 00:18:34.982 "copy": true, 00:18:34.982 "nvme_iov_md": false 00:18:34.982 }, 00:18:34.982 "memory_domains": [ 00:18:34.982 { 00:18:34.982 "dma_device_id": "system", 00:18:34.983 "dma_device_type": 1 00:18:34.983 }, 00:18:34.983 { 00:18:34.983 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:34.983 "dma_device_type": 2 00:18:34.983 } 00:18:34.983 ], 00:18:34.983 "driver_specific": {} 00:18:34.983 }' 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:34.983 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:35.242 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:35.503 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:35.503 "name": "BaseBdev3", 00:18:35.503 "aliases": [ 00:18:35.503 "35a9d605-ed41-44ac-a58a-3a1e9acea1f7" 00:18:35.503 ], 00:18:35.503 "product_name": "Malloc disk", 00:18:35.503 "block_size": 512, 00:18:35.503 "num_blocks": 65536, 00:18:35.503 "uuid": "35a9d605-ed41-44ac-a58a-3a1e9acea1f7", 00:18:35.503 "assigned_rate_limits": { 00:18:35.503 "rw_ios_per_sec": 0, 00:18:35.503 "rw_mbytes_per_sec": 0, 00:18:35.503 "r_mbytes_per_sec": 0, 00:18:35.503 "w_mbytes_per_sec": 0 00:18:35.503 }, 00:18:35.503 "claimed": true, 00:18:35.503 "claim_type": "exclusive_write", 00:18:35.503 "zoned": 
false, 00:18:35.503 "supported_io_types": { 00:18:35.503 "read": true, 00:18:35.503 "write": true, 00:18:35.503 "unmap": true, 00:18:35.503 "flush": true, 00:18:35.503 "reset": true, 00:18:35.503 "nvme_admin": false, 00:18:35.503 "nvme_io": false, 00:18:35.503 "nvme_io_md": false, 00:18:35.503 "write_zeroes": true, 00:18:35.503 "zcopy": true, 00:18:35.503 "get_zone_info": false, 00:18:35.503 "zone_management": false, 00:18:35.503 "zone_append": false, 00:18:35.503 "compare": false, 00:18:35.503 "compare_and_write": false, 00:18:35.503 "abort": true, 00:18:35.503 "seek_hole": false, 00:18:35.503 "seek_data": false, 00:18:35.503 "copy": true, 00:18:35.503 "nvme_iov_md": false 00:18:35.503 }, 00:18:35.503 "memory_domains": [ 00:18:35.503 { 00:18:35.503 "dma_device_id": "system", 00:18:35.503 "dma_device_type": 1 00:18:35.503 }, 00:18:35.503 { 00:18:35.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.503 "dma_device_type": 2 00:18:35.503 } 00:18:35.503 ], 00:18:35.503 "driver_specific": {} 00:18:35.503 }' 00:18:35.503 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.503 11:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:35.503 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:35.503 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.503 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:35.763 11:59:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:35.763 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:36.333 [2024-07-15 11:59:49.781529] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:36.333 [2024-07-15 11:59:49.781556] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:36.333 [2024-07-15 11:59:49.781608] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:36.333 [2024-07-15 11:59:49.781887] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:36.333 [2024-07-15 11:59:49.781900] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x250c7c0 name Existed_Raid, state offline 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1507058 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1507058 ']' 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1507058 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1507058 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 
00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1507058' 00:18:36.333 killing process with pid 1507058 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1507058 00:18:36.333 [2024-07-15 11:59:49.862485] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:36.333 11:59:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1507058 00:18:36.333 [2024-07-15 11:59:49.889241] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:36.593 11:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:36.593 00:18:36.594 real 0m31.819s 00:18:36.594 user 0m58.505s 00:18:36.594 sys 0m5.558s 00:18:36.594 11:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:36.594 11:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:36.594 ************************************ 00:18:36.594 END TEST raid_state_function_test_sb 00:18:36.594 ************************************ 00:18:36.594 11:59:50 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:36.594 11:59:50 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:18:36.594 11:59:50 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:18:36.594 11:59:50 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:36.594 11:59:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:36.594 ************************************ 00:18:36.594 START TEST raid_superblock_test 00:18:36.594 ************************************ 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # 
raid_superblock_test raid1 3 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1511693 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 
1511693 /var/tmp/spdk-raid.sock 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1511693 ']' 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:36.594 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:36.594 11:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:36.853 [2024-07-15 11:59:50.236289] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:18:36.853 [2024-07-15 11:59:50.236353] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1511693 ] 00:18:36.853 [2024-07-15 11:59:50.353771] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:37.113 [2024-07-15 11:59:50.458111] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:37.113 [2024-07-15 11:59:50.516987] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:37.113 [2024-07-15 11:59:50.517018] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:37.682 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:37.683 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:37.942 malloc1 00:18:37.942 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:38.202 [2024-07-15 11:59:51.652529] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:38.202 [2024-07-15 11:59:51.652578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:38.202 [2024-07-15 11:59:51.652600] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8c560 00:18:38.202 [2024-07-15 11:59:51.652613] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:38.202 [2024-07-15 11:59:51.654297] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:38.202 [2024-07-15 11:59:51.654326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:38.202 pt1 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:38.202 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:38.202 11:59:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:38.461 malloc2 00:18:38.461 11:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:38.721 [2024-07-15 11:59:52.134464] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:38.721 [2024-07-15 11:59:52.134509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:38.721 [2024-07-15 11:59:52.134528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2a5b0 00:18:38.721 [2024-07-15 11:59:52.134540] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:38.721 [2024-07-15 11:59:52.136076] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:38.721 [2024-07-15 11:59:52.136110] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:38.721 pt2 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:18:38.721 11:59:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:38.721 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:38.982 malloc3 00:18:38.982 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:39.242 [2024-07-15 11:59:52.620293] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:39.242 [2024-07-15 11:59:52.620340] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.242 [2024-07-15 11:59:52.620359] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2abe0 00:18:39.242 [2024-07-15 11:59:52.620371] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.242 [2024-07-15 11:59:52.621954] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.242 [2024-07-15 11:59:52.621982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:39.242 pt3 00:18:39.242 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:18:39.242 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:18:39.242 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:18:39.502 [2024-07-15 11:59:52.864952] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:39.502 [2024-07-15 11:59:52.866286] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:18:39.502 [2024-07-15 11:59:52.866342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:39.502 [2024-07-15 11:59:52.866493] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2b510 00:18:39.502 [2024-07-15 11:59:52.866504] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:39.503 [2024-07-15 11:59:52.866711] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e8b320 00:18:39.503 [2024-07-15 11:59:52.866856] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2b510 00:18:39.503 [2024-07-15 11:59:52.866866] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f2b510 00:18:39.503 [2024-07-15 11:59:52.866965] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.503 11:59:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.503 11:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:39.768 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:39.768 "name": "raid_bdev1", 00:18:39.768 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:39.768 "strip_size_kb": 0, 00:18:39.768 "state": "online", 00:18:39.768 "raid_level": "raid1", 00:18:39.768 "superblock": true, 00:18:39.768 "num_base_bdevs": 3, 00:18:39.768 "num_base_bdevs_discovered": 3, 00:18:39.768 "num_base_bdevs_operational": 3, 00:18:39.768 "base_bdevs_list": [ 00:18:39.768 { 00:18:39.768 "name": "pt1", 00:18:39.768 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:39.768 "is_configured": true, 00:18:39.768 "data_offset": 2048, 00:18:39.768 "data_size": 63488 00:18:39.768 }, 00:18:39.768 { 00:18:39.768 "name": "pt2", 00:18:39.768 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:39.768 "is_configured": true, 00:18:39.768 "data_offset": 2048, 00:18:39.768 "data_size": 63488 00:18:39.768 }, 00:18:39.768 { 00:18:39.768 "name": "pt3", 00:18:39.768 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:39.768 "is_configured": true, 00:18:39.768 "data_offset": 2048, 00:18:39.768 "data_size": 63488 00:18:39.768 } 00:18:39.768 ] 00:18:39.768 }' 00:18:39.768 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:39.768 11:59:53 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:40.345 11:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:40.630 [2024-07-15 11:59:54.004222] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:40.630 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:40.630 "name": "raid_bdev1", 00:18:40.630 "aliases": [ 00:18:40.630 "d6d9d629-b702-43bf-bef4-d8fc31259b6b" 00:18:40.630 ], 00:18:40.630 "product_name": "Raid Volume", 00:18:40.630 "block_size": 512, 00:18:40.630 "num_blocks": 63488, 00:18:40.630 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:40.630 "assigned_rate_limits": { 00:18:40.630 "rw_ios_per_sec": 0, 00:18:40.630 "rw_mbytes_per_sec": 0, 00:18:40.630 "r_mbytes_per_sec": 0, 00:18:40.630 "w_mbytes_per_sec": 0 00:18:40.630 }, 00:18:40.630 "claimed": false, 00:18:40.630 "zoned": false, 00:18:40.630 "supported_io_types": { 00:18:40.630 "read": true, 00:18:40.630 "write": true, 00:18:40.630 "unmap": false, 00:18:40.630 "flush": false, 00:18:40.630 "reset": true, 00:18:40.630 "nvme_admin": false, 00:18:40.630 "nvme_io": false, 00:18:40.630 "nvme_io_md": false, 00:18:40.630 "write_zeroes": true, 00:18:40.630 "zcopy": false, 00:18:40.630 "get_zone_info": false, 00:18:40.630 "zone_management": false, 00:18:40.630 "zone_append": false, 00:18:40.630 "compare": false, 00:18:40.630 "compare_and_write": false, 00:18:40.630 "abort": false, 00:18:40.630 "seek_hole": false, 
00:18:40.630 "seek_data": false, 00:18:40.630 "copy": false, 00:18:40.630 "nvme_iov_md": false 00:18:40.630 }, 00:18:40.630 "memory_domains": [ 00:18:40.630 { 00:18:40.630 "dma_device_id": "system", 00:18:40.630 "dma_device_type": 1 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.630 "dma_device_type": 2 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "dma_device_id": "system", 00:18:40.630 "dma_device_type": 1 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.630 "dma_device_type": 2 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "dma_device_id": "system", 00:18:40.630 "dma_device_type": 1 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.630 "dma_device_type": 2 00:18:40.630 } 00:18:40.630 ], 00:18:40.630 "driver_specific": { 00:18:40.630 "raid": { 00:18:40.630 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:40.630 "strip_size_kb": 0, 00:18:40.630 "state": "online", 00:18:40.630 "raid_level": "raid1", 00:18:40.630 "superblock": true, 00:18:40.630 "num_base_bdevs": 3, 00:18:40.630 "num_base_bdevs_discovered": 3, 00:18:40.630 "num_base_bdevs_operational": 3, 00:18:40.630 "base_bdevs_list": [ 00:18:40.630 { 00:18:40.630 "name": "pt1", 00:18:40.630 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:40.630 "is_configured": true, 00:18:40.630 "data_offset": 2048, 00:18:40.630 "data_size": 63488 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "name": "pt2", 00:18:40.630 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:40.630 "is_configured": true, 00:18:40.630 "data_offset": 2048, 00:18:40.630 "data_size": 63488 00:18:40.630 }, 00:18:40.630 { 00:18:40.630 "name": "pt3", 00:18:40.630 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:40.630 "is_configured": true, 00:18:40.630 "data_offset": 2048, 00:18:40.630 "data_size": 63488 00:18:40.630 } 00:18:40.630 ] 00:18:40.630 } 00:18:40.630 } 00:18:40.630 }' 00:18:40.630 11:59:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:40.630 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:40.630 pt2 00:18:40.630 pt3' 00:18:40.630 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:40.630 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:40.630 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:40.908 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:40.908 "name": "pt1", 00:18:40.908 "aliases": [ 00:18:40.908 "00000000-0000-0000-0000-000000000001" 00:18:40.908 ], 00:18:40.908 "product_name": "passthru", 00:18:40.908 "block_size": 512, 00:18:40.908 "num_blocks": 65536, 00:18:40.908 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:40.908 "assigned_rate_limits": { 00:18:40.908 "rw_ios_per_sec": 0, 00:18:40.908 "rw_mbytes_per_sec": 0, 00:18:40.908 "r_mbytes_per_sec": 0, 00:18:40.908 "w_mbytes_per_sec": 0 00:18:40.908 }, 00:18:40.908 "claimed": true, 00:18:40.908 "claim_type": "exclusive_write", 00:18:40.908 "zoned": false, 00:18:40.908 "supported_io_types": { 00:18:40.908 "read": true, 00:18:40.908 "write": true, 00:18:40.908 "unmap": true, 00:18:40.908 "flush": true, 00:18:40.908 "reset": true, 00:18:40.908 "nvme_admin": false, 00:18:40.908 "nvme_io": false, 00:18:40.908 "nvme_io_md": false, 00:18:40.908 "write_zeroes": true, 00:18:40.908 "zcopy": true, 00:18:40.908 "get_zone_info": false, 00:18:40.908 "zone_management": false, 00:18:40.908 "zone_append": false, 00:18:40.908 "compare": false, 00:18:40.908 "compare_and_write": false, 00:18:40.908 "abort": true, 00:18:40.908 "seek_hole": false, 00:18:40.909 "seek_data": false, 
00:18:40.909 "copy": true, 00:18:40.909 "nvme_iov_md": false 00:18:40.909 }, 00:18:40.909 "memory_domains": [ 00:18:40.909 { 00:18:40.909 "dma_device_id": "system", 00:18:40.909 "dma_device_type": 1 00:18:40.909 }, 00:18:40.909 { 00:18:40.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:40.909 "dma_device_type": 2 00:18:40.909 } 00:18:40.909 ], 00:18:40.909 "driver_specific": { 00:18:40.909 "passthru": { 00:18:40.909 "name": "pt1", 00:18:40.909 "base_bdev_name": "malloc1" 00:18:40.909 } 00:18:40.909 } 00:18:40.909 }' 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:40.909 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:18:41.169 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:41.429 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:41.429 "name": "pt2", 00:18:41.429 "aliases": [ 00:18:41.429 "00000000-0000-0000-0000-000000000002" 00:18:41.429 ], 00:18:41.429 "product_name": "passthru", 00:18:41.429 "block_size": 512, 00:18:41.429 "num_blocks": 65536, 00:18:41.429 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:41.429 "assigned_rate_limits": { 00:18:41.429 "rw_ios_per_sec": 0, 00:18:41.429 "rw_mbytes_per_sec": 0, 00:18:41.429 "r_mbytes_per_sec": 0, 00:18:41.429 "w_mbytes_per_sec": 0 00:18:41.429 }, 00:18:41.429 "claimed": true, 00:18:41.429 "claim_type": "exclusive_write", 00:18:41.429 "zoned": false, 00:18:41.429 "supported_io_types": { 00:18:41.429 "read": true, 00:18:41.429 "write": true, 00:18:41.429 "unmap": true, 00:18:41.429 "flush": true, 00:18:41.429 "reset": true, 00:18:41.429 "nvme_admin": false, 00:18:41.429 "nvme_io": false, 00:18:41.429 "nvme_io_md": false, 00:18:41.429 "write_zeroes": true, 00:18:41.429 "zcopy": true, 00:18:41.429 "get_zone_info": false, 00:18:41.429 "zone_management": false, 00:18:41.429 "zone_append": false, 00:18:41.429 "compare": false, 00:18:41.429 "compare_and_write": false, 00:18:41.429 "abort": true, 00:18:41.429 "seek_hole": false, 00:18:41.429 "seek_data": false, 00:18:41.429 "copy": true, 00:18:41.429 "nvme_iov_md": false 00:18:41.429 }, 00:18:41.429 "memory_domains": [ 00:18:41.429 { 00:18:41.429 "dma_device_id": "system", 00:18:41.429 "dma_device_type": 1 00:18:41.429 }, 00:18:41.429 { 00:18:41.429 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.429 "dma_device_type": 2 00:18:41.429 } 00:18:41.429 ], 00:18:41.429 "driver_specific": { 00:18:41.429 "passthru": { 00:18:41.429 "name": "pt2", 00:18:41.429 "base_bdev_name": "malloc2" 00:18:41.429 } 00:18:41.429 } 00:18:41.429 }' 00:18:41.429 11:59:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:41.429 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:41.429 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:41.429 11:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:41.429 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:41.689 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:41.949 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:41.949 "name": "pt3", 00:18:41.949 "aliases": [ 00:18:41.949 "00000000-0000-0000-0000-000000000003" 00:18:41.949 ], 00:18:41.949 "product_name": "passthru", 00:18:41.949 "block_size": 512, 00:18:41.949 "num_blocks": 65536, 00:18:41.949 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:41.949 "assigned_rate_limits": { 
00:18:41.949 "rw_ios_per_sec": 0, 00:18:41.949 "rw_mbytes_per_sec": 0, 00:18:41.949 "r_mbytes_per_sec": 0, 00:18:41.949 "w_mbytes_per_sec": 0 00:18:41.949 }, 00:18:41.949 "claimed": true, 00:18:41.949 "claim_type": "exclusive_write", 00:18:41.949 "zoned": false, 00:18:41.949 "supported_io_types": { 00:18:41.949 "read": true, 00:18:41.949 "write": true, 00:18:41.949 "unmap": true, 00:18:41.949 "flush": true, 00:18:41.949 "reset": true, 00:18:41.949 "nvme_admin": false, 00:18:41.949 "nvme_io": false, 00:18:41.949 "nvme_io_md": false, 00:18:41.949 "write_zeroes": true, 00:18:41.949 "zcopy": true, 00:18:41.949 "get_zone_info": false, 00:18:41.949 "zone_management": false, 00:18:41.949 "zone_append": false, 00:18:41.949 "compare": false, 00:18:41.949 "compare_and_write": false, 00:18:41.949 "abort": true, 00:18:41.949 "seek_hole": false, 00:18:41.949 "seek_data": false, 00:18:41.949 "copy": true, 00:18:41.949 "nvme_iov_md": false 00:18:41.949 }, 00:18:41.949 "memory_domains": [ 00:18:41.949 { 00:18:41.949 "dma_device_id": "system", 00:18:41.949 "dma_device_type": 1 00:18:41.949 }, 00:18:41.949 { 00:18:41.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.949 "dma_device_type": 2 00:18:41.949 } 00:18:41.949 ], 00:18:41.949 "driver_specific": { 00:18:41.949 "passthru": { 00:18:41.949 "name": "pt3", 00:18:41.949 "base_bdev_name": "malloc3" 00:18:41.949 } 00:18:41.949 } 00:18:41.949 }' 00:18:41.949 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:41.949 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:42.208 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:42.468 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:42.468 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:42.468 11:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:18:42.468 [2024-07-15 11:59:56.041601] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:42.468 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d6d9d629-b702-43bf-bef4-d8fc31259b6b 00:18:42.468 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z d6d9d629-b702-43bf-bef4-d8fc31259b6b ']' 00:18:42.468 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:42.727 [2024-07-15 11:59:56.277955] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:42.727 [2024-07-15 11:59:56.277980] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:42.727 [2024-07-15 11:59:56.278039] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:42.727 [2024-07-15 11:59:56.278106] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free 
all in destruct 00:18:42.727 [2024-07-15 11:59:56.278118] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2b510 name raid_bdev1, state offline 00:18:42.728 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:42.728 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:18:42.987 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:18:42.987 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:18:42.987 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:42.987 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:43.247 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:43.247 11:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:43.506 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:18:43.506 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:43.766 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:43.766 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' 
false == true ']' 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:44.025 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 
raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:18:44.285 [2024-07-15 11:59:57.729761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:44.285 [2024-07-15 11:59:57.731100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:44.285 [2024-07-15 11:59:57.731142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:44.285 [2024-07-15 11:59:57.731187] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:44.285 [2024-07-15 11:59:57.731224] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:44.285 [2024-07-15 11:59:57.731247] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:44.285 [2024-07-15 11:59:57.731265] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:44.285 [2024-07-15 11:59:57.731275] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2e500 name raid_bdev1, state configuring 00:18:44.285 request: 00:18:44.285 { 00:18:44.285 "name": "raid_bdev1", 00:18:44.285 "raid_level": "raid1", 00:18:44.285 "base_bdevs": [ 00:18:44.285 "malloc1", 00:18:44.285 "malloc2", 00:18:44.285 "malloc3" 00:18:44.285 ], 00:18:44.285 "superblock": false, 00:18:44.285 "method": "bdev_raid_create", 00:18:44.285 "req_id": 1 00:18:44.285 } 00:18:44.285 Got JSON-RPC error response 00:18:44.285 response: 00:18:44.285 { 00:18:44.285 "code": -17, 00:18:44.285 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:44.285 } 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@670 -- # [[ -n '' ]] 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.285 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:18:44.545 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:18:44.545 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:18:44.545 11:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:44.805 [2024-07-15 11:59:58.227009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:44.805 [2024-07-15 11:59:58.227051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:44.805 [2024-07-15 11:59:58.227070] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2c210 00:18:44.805 [2024-07-15 11:59:58.227082] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:44.805 [2024-07-15 11:59:58.228713] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:44.805 [2024-07-15 11:59:58.228742] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:44.805 [2024-07-15 11:59:58.228803] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:44.805 [2024-07-15 11:59:58.228828] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:44.805 pt1 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring 
raid1 0 3 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:44.805 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:45.065 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.065 "name": "raid_bdev1", 00:18:45.065 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:45.065 "strip_size_kb": 0, 00:18:45.065 "state": "configuring", 00:18:45.065 "raid_level": "raid1", 00:18:45.065 "superblock": true, 00:18:45.065 "num_base_bdevs": 3, 00:18:45.065 "num_base_bdevs_discovered": 1, 00:18:45.065 "num_base_bdevs_operational": 3, 00:18:45.065 "base_bdevs_list": [ 00:18:45.065 { 00:18:45.065 "name": "pt1", 00:18:45.065 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:45.065 "is_configured": true, 00:18:45.065 "data_offset": 2048, 00:18:45.065 
"data_size": 63488 00:18:45.065 }, 00:18:45.065 { 00:18:45.065 "name": null, 00:18:45.065 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:45.065 "is_configured": false, 00:18:45.065 "data_offset": 2048, 00:18:45.065 "data_size": 63488 00:18:45.065 }, 00:18:45.065 { 00:18:45.065 "name": null, 00:18:45.065 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:45.065 "is_configured": false, 00:18:45.065 "data_offset": 2048, 00:18:45.065 "data_size": 63488 00:18:45.065 } 00:18:45.065 ] 00:18:45.065 }' 00:18:45.065 11:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:45.065 11:59:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:45.634 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:18:45.634 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:45.894 [2024-07-15 11:59:59.337949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:45.894 [2024-07-15 11:59:59.337998] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:45.894 [2024-07-15 11:59:59.338025] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8e9c0 00:18:45.894 [2024-07-15 11:59:59.338037] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:45.894 [2024-07-15 11:59:59.338368] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:45.894 [2024-07-15 11:59:59.338385] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:45.894 [2024-07-15 11:59:59.338444] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:45.894 [2024-07-15 11:59:59.338462] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt2 is claimed 00:18:45.894 pt2 00:18:45.894 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:46.154 [2024-07-15 11:59:59.582608] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.154 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:46.414 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:46.414 "name": "raid_bdev1", 00:18:46.414 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:46.414 "strip_size_kb": 
0, 00:18:46.414 "state": "configuring", 00:18:46.414 "raid_level": "raid1", 00:18:46.414 "superblock": true, 00:18:46.414 "num_base_bdevs": 3, 00:18:46.414 "num_base_bdevs_discovered": 1, 00:18:46.414 "num_base_bdevs_operational": 3, 00:18:46.414 "base_bdevs_list": [ 00:18:46.414 { 00:18:46.414 "name": "pt1", 00:18:46.414 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:46.414 "is_configured": true, 00:18:46.414 "data_offset": 2048, 00:18:46.414 "data_size": 63488 00:18:46.414 }, 00:18:46.414 { 00:18:46.414 "name": null, 00:18:46.414 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:46.414 "is_configured": false, 00:18:46.414 "data_offset": 2048, 00:18:46.414 "data_size": 63488 00:18:46.414 }, 00:18:46.414 { 00:18:46.414 "name": null, 00:18:46.414 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:46.414 "is_configured": false, 00:18:46.414 "data_offset": 2048, 00:18:46.414 "data_size": 63488 00:18:46.414 } 00:18:46.414 ] 00:18:46.414 }' 00:18:46.414 11:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:46.414 11:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.981 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:18:46.981 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:46.981 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:47.239 [2024-07-15 12:00:00.725637] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:47.239 [2024-07-15 12:00:00.725697] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:47.239 [2024-07-15 12:00:00.725720] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8add0 00:18:47.239 
[2024-07-15 12:00:00.725733] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:47.239 [2024-07-15 12:00:00.726072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:47.239 [2024-07-15 12:00:00.726089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:47.239 [2024-07-15 12:00:00.726152] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:47.239 [2024-07-15 12:00:00.726170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:47.239 pt2 00:18:47.239 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:47.239 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:47.239 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:47.497 [2024-07-15 12:00:00.970283] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:47.497 [2024-07-15 12:00:00.970321] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:47.497 [2024-07-15 12:00:00.970341] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2f220 00:18:47.497 [2024-07-15 12:00:00.970353] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:47.497 [2024-07-15 12:00:00.970655] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:47.497 [2024-07-15 12:00:00.970673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:47.497 [2024-07-15 12:00:00.970736] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:47.497 [2024-07-15 12:00:00.970754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev pt3 is claimed 00:18:47.497 [2024-07-15 12:00:00.970865] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e8b6b0 00:18:47.497 [2024-07-15 12:00:00.970876] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:47.497 [2024-07-15 12:00:00.971044] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2db70 00:18:47.497 [2024-07-15 12:00:00.971170] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e8b6b0 00:18:47.497 [2024-07-15 12:00:00.971180] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e8b6b0 00:18:47.497 [2024-07-15 12:00:00.971280] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:47.497 pt3 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.497 12:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:47.761 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:47.761 "name": "raid_bdev1", 00:18:47.761 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:47.761 "strip_size_kb": 0, 00:18:47.761 "state": "online", 00:18:47.761 "raid_level": "raid1", 00:18:47.761 "superblock": true, 00:18:47.761 "num_base_bdevs": 3, 00:18:47.761 "num_base_bdevs_discovered": 3, 00:18:47.761 "num_base_bdevs_operational": 3, 00:18:47.761 "base_bdevs_list": [ 00:18:47.761 { 00:18:47.761 "name": "pt1", 00:18:47.761 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:47.761 "is_configured": true, 00:18:47.761 "data_offset": 2048, 00:18:47.761 "data_size": 63488 00:18:47.761 }, 00:18:47.761 { 00:18:47.761 "name": "pt2", 00:18:47.761 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:47.761 "is_configured": true, 00:18:47.761 "data_offset": 2048, 00:18:47.761 "data_size": 63488 00:18:47.761 }, 00:18:47.761 { 00:18:47.761 "name": "pt3", 00:18:47.761 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:47.761 "is_configured": true, 00:18:47.761 "data_offset": 2048, 00:18:47.761 "data_size": 63488 00:18:47.761 } 00:18:47.761 ] 00:18:47.761 }' 00:18:47.761 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:47.761 12:00:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:48.329 12:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:48.589 [2024-07-15 12:00:02.037373] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:48.589 "name": "raid_bdev1", 00:18:48.589 "aliases": [ 00:18:48.589 "d6d9d629-b702-43bf-bef4-d8fc31259b6b" 00:18:48.589 ], 00:18:48.589 "product_name": "Raid Volume", 00:18:48.589 "block_size": 512, 00:18:48.589 "num_blocks": 63488, 00:18:48.589 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:48.589 "assigned_rate_limits": { 00:18:48.589 "rw_ios_per_sec": 0, 00:18:48.589 "rw_mbytes_per_sec": 0, 00:18:48.589 "r_mbytes_per_sec": 0, 00:18:48.589 "w_mbytes_per_sec": 0 00:18:48.589 }, 00:18:48.589 "claimed": false, 00:18:48.589 "zoned": false, 00:18:48.589 "supported_io_types": { 00:18:48.589 "read": true, 00:18:48.589 "write": true, 00:18:48.589 "unmap": false, 00:18:48.589 "flush": false, 00:18:48.589 "reset": true, 00:18:48.589 "nvme_admin": false, 00:18:48.589 "nvme_io": false, 00:18:48.589 "nvme_io_md": false, 00:18:48.589 "write_zeroes": true, 00:18:48.589 "zcopy": false, 00:18:48.589 "get_zone_info": false, 00:18:48.589 "zone_management": false, 00:18:48.589 "zone_append": false, 
00:18:48.589 "compare": false, 00:18:48.589 "compare_and_write": false, 00:18:48.589 "abort": false, 00:18:48.589 "seek_hole": false, 00:18:48.589 "seek_data": false, 00:18:48.589 "copy": false, 00:18:48.589 "nvme_iov_md": false 00:18:48.589 }, 00:18:48.589 "memory_domains": [ 00:18:48.589 { 00:18:48.589 "dma_device_id": "system", 00:18:48.589 "dma_device_type": 1 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.589 "dma_device_type": 2 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "dma_device_id": "system", 00:18:48.589 "dma_device_type": 1 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.589 "dma_device_type": 2 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "dma_device_id": "system", 00:18:48.589 "dma_device_type": 1 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.589 "dma_device_type": 2 00:18:48.589 } 00:18:48.589 ], 00:18:48.589 "driver_specific": { 00:18:48.589 "raid": { 00:18:48.589 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:48.589 "strip_size_kb": 0, 00:18:48.589 "state": "online", 00:18:48.589 "raid_level": "raid1", 00:18:48.589 "superblock": true, 00:18:48.589 "num_base_bdevs": 3, 00:18:48.589 "num_base_bdevs_discovered": 3, 00:18:48.589 "num_base_bdevs_operational": 3, 00:18:48.589 "base_bdevs_list": [ 00:18:48.589 { 00:18:48.589 "name": "pt1", 00:18:48.589 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:48.589 "is_configured": true, 00:18:48.589 "data_offset": 2048, 00:18:48.589 "data_size": 63488 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "name": "pt2", 00:18:48.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:48.589 "is_configured": true, 00:18:48.589 "data_offset": 2048, 00:18:48.589 "data_size": 63488 00:18:48.589 }, 00:18:48.589 { 00:18:48.589 "name": "pt3", 00:18:48.589 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:48.589 "is_configured": true, 00:18:48.589 "data_offset": 2048, 
00:18:48.589 "data_size": 63488 00:18:48.589 } 00:18:48.589 ] 00:18:48.589 } 00:18:48.589 } 00:18:48.589 }' 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:48.589 pt2 00:18:48.589 pt3' 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:48.589 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:48.849 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:48.849 "name": "pt1", 00:18:48.849 "aliases": [ 00:18:48.849 "00000000-0000-0000-0000-000000000001" 00:18:48.849 ], 00:18:48.849 "product_name": "passthru", 00:18:48.849 "block_size": 512, 00:18:48.849 "num_blocks": 65536, 00:18:48.849 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:48.849 "assigned_rate_limits": { 00:18:48.849 "rw_ios_per_sec": 0, 00:18:48.849 "rw_mbytes_per_sec": 0, 00:18:48.849 "r_mbytes_per_sec": 0, 00:18:48.849 "w_mbytes_per_sec": 0 00:18:48.849 }, 00:18:48.849 "claimed": true, 00:18:48.849 "claim_type": "exclusive_write", 00:18:48.849 "zoned": false, 00:18:48.849 "supported_io_types": { 00:18:48.849 "read": true, 00:18:48.849 "write": true, 00:18:48.849 "unmap": true, 00:18:48.849 "flush": true, 00:18:48.849 "reset": true, 00:18:48.849 "nvme_admin": false, 00:18:48.849 "nvme_io": false, 00:18:48.849 "nvme_io_md": false, 00:18:48.849 "write_zeroes": true, 00:18:48.849 "zcopy": true, 00:18:48.849 "get_zone_info": false, 00:18:48.849 "zone_management": false, 00:18:48.849 "zone_append": false, 00:18:48.849 "compare": false, 
00:18:48.849 "compare_and_write": false, 00:18:48.849 "abort": true, 00:18:48.849 "seek_hole": false, 00:18:48.849 "seek_data": false, 00:18:48.849 "copy": true, 00:18:48.849 "nvme_iov_md": false 00:18:48.849 }, 00:18:48.849 "memory_domains": [ 00:18:48.849 { 00:18:48.849 "dma_device_id": "system", 00:18:48.849 "dma_device_type": 1 00:18:48.849 }, 00:18:48.849 { 00:18:48.849 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.849 "dma_device_type": 2 00:18:48.849 } 00:18:48.849 ], 00:18:48.849 "driver_specific": { 00:18:48.849 "passthru": { 00:18:48.849 "name": "pt1", 00:18:48.849 "base_bdev_name": "malloc1" 00:18:48.849 } 00:18:48.849 } 00:18:48.849 }' 00:18:48.849 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:48.849 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.108 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.367 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.367 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.367 12:00:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:49.368 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:49.627 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:49.627 "name": "pt2", 00:18:49.627 "aliases": [ 00:18:49.627 "00000000-0000-0000-0000-000000000002" 00:18:49.627 ], 00:18:49.627 "product_name": "passthru", 00:18:49.627 "block_size": 512, 00:18:49.627 "num_blocks": 65536, 00:18:49.627 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:49.627 "assigned_rate_limits": { 00:18:49.627 "rw_ios_per_sec": 0, 00:18:49.627 "rw_mbytes_per_sec": 0, 00:18:49.627 "r_mbytes_per_sec": 0, 00:18:49.627 "w_mbytes_per_sec": 0 00:18:49.627 }, 00:18:49.627 "claimed": true, 00:18:49.627 "claim_type": "exclusive_write", 00:18:49.627 "zoned": false, 00:18:49.627 "supported_io_types": { 00:18:49.627 "read": true, 00:18:49.627 "write": true, 00:18:49.627 "unmap": true, 00:18:49.627 "flush": true, 00:18:49.627 "reset": true, 00:18:49.627 "nvme_admin": false, 00:18:49.627 "nvme_io": false, 00:18:49.627 "nvme_io_md": false, 00:18:49.627 "write_zeroes": true, 00:18:49.627 "zcopy": true, 00:18:49.627 "get_zone_info": false, 00:18:49.627 "zone_management": false, 00:18:49.627 "zone_append": false, 00:18:49.627 "compare": false, 00:18:49.627 "compare_and_write": false, 00:18:49.627 "abort": true, 00:18:49.627 "seek_hole": false, 00:18:49.627 "seek_data": false, 00:18:49.627 "copy": true, 00:18:49.627 "nvme_iov_md": false 00:18:49.627 }, 00:18:49.627 "memory_domains": [ 00:18:49.627 { 00:18:49.627 "dma_device_id": "system", 00:18:49.627 "dma_device_type": 1 00:18:49.627 }, 00:18:49.627 { 00:18:49.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.627 "dma_device_type": 2 00:18:49.627 } 00:18:49.627 ], 00:18:49.627 "driver_specific": { 00:18:49.627 "passthru": { 00:18:49.627 
"name": "pt2", 00:18:49.627 "base_bdev_name": "malloc2" 00:18:49.627 } 00:18:49.627 } 00:18:49.627 }' 00:18:49.627 12:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.627 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:49.886 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:50.144 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:50.144 "name": "pt3", 00:18:50.144 "aliases": [ 00:18:50.144 "00000000-0000-0000-0000-000000000003" 00:18:50.144 ], 00:18:50.144 "product_name": "passthru", 00:18:50.144 "block_size": 512, 00:18:50.144 
"num_blocks": 65536, 00:18:50.144 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:50.144 "assigned_rate_limits": { 00:18:50.144 "rw_ios_per_sec": 0, 00:18:50.144 "rw_mbytes_per_sec": 0, 00:18:50.144 "r_mbytes_per_sec": 0, 00:18:50.144 "w_mbytes_per_sec": 0 00:18:50.144 }, 00:18:50.144 "claimed": true, 00:18:50.144 "claim_type": "exclusive_write", 00:18:50.144 "zoned": false, 00:18:50.144 "supported_io_types": { 00:18:50.144 "read": true, 00:18:50.145 "write": true, 00:18:50.145 "unmap": true, 00:18:50.145 "flush": true, 00:18:50.145 "reset": true, 00:18:50.145 "nvme_admin": false, 00:18:50.145 "nvme_io": false, 00:18:50.145 "nvme_io_md": false, 00:18:50.145 "write_zeroes": true, 00:18:50.145 "zcopy": true, 00:18:50.145 "get_zone_info": false, 00:18:50.145 "zone_management": false, 00:18:50.145 "zone_append": false, 00:18:50.145 "compare": false, 00:18:50.145 "compare_and_write": false, 00:18:50.145 "abort": true, 00:18:50.145 "seek_hole": false, 00:18:50.145 "seek_data": false, 00:18:50.145 "copy": true, 00:18:50.145 "nvme_iov_md": false 00:18:50.145 }, 00:18:50.145 "memory_domains": [ 00:18:50.145 { 00:18:50.145 "dma_device_id": "system", 00:18:50.145 "dma_device_type": 1 00:18:50.145 }, 00:18:50.145 { 00:18:50.145 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.145 "dma_device_type": 2 00:18:50.145 } 00:18:50.145 ], 00:18:50.145 "driver_specific": { 00:18:50.145 "passthru": { 00:18:50.145 "name": "pt3", 00:18:50.145 "base_bdev_name": "malloc3" 00:18:50.145 } 00:18:50.145 } 00:18:50.145 }' 00:18:50.145 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.145 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:50.145 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:50.145 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:50.402 12:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:18:50.661 [2024-07-15 12:00:04.219316] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:50.661 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' d6d9d629-b702-43bf-bef4-d8fc31259b6b '!=' d6d9d629-b702-43bf-bef4-d8fc31259b6b ']' 00:18:50.661 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:18:50.661 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:50.661 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:18:50.661 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:50.920 [2024-07-15 12:00:04.467755] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.920 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:51.179 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.179 "name": "raid_bdev1", 00:18:51.179 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:51.179 "strip_size_kb": 0, 00:18:51.179 "state": "online", 00:18:51.179 "raid_level": "raid1", 00:18:51.179 "superblock": true, 00:18:51.179 "num_base_bdevs": 3, 00:18:51.179 "num_base_bdevs_discovered": 2, 00:18:51.179 "num_base_bdevs_operational": 2, 00:18:51.179 "base_bdevs_list": [ 00:18:51.179 { 00:18:51.179 "name": null, 00:18:51.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.179 "is_configured": false, 00:18:51.179 
"data_offset": 2048, 00:18:51.179 "data_size": 63488 00:18:51.179 }, 00:18:51.179 { 00:18:51.179 "name": "pt2", 00:18:51.179 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:51.179 "is_configured": true, 00:18:51.179 "data_offset": 2048, 00:18:51.179 "data_size": 63488 00:18:51.179 }, 00:18:51.179 { 00:18:51.179 "name": "pt3", 00:18:51.179 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:51.179 "is_configured": true, 00:18:51.179 "data_offset": 2048, 00:18:51.179 "data_size": 63488 00:18:51.179 } 00:18:51.179 ] 00:18:51.179 }' 00:18:51.179 12:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.179 12:00:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:51.747 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:52.006 [2024-07-15 12:00:05.550747] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:52.006 [2024-07-15 12:00:05.550772] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:52.006 [2024-07-15 12:00:05.550829] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:52.006 [2024-07-15 12:00:05.550884] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:52.006 [2024-07-15 12:00:05.550896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8b6b0 name raid_bdev1, state offline 00:18:52.006 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.006 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:18:52.264 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 
00:18:52.264 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:18:52.264 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:18:52.264 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:52.264 12:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:52.522 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:52.522 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:52.522 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:52.780 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:18:52.780 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:18:52.780 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:18:52.780 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:52.780 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:53.038 [2024-07-15 12:00:06.553344] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:53.038 [2024-07-15 12:00:06.553392] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:53.038 [2024-07-15 12:00:06.553410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8b3f0 00:18:53.038 [2024-07-15 12:00:06.553423] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:53.038 [2024-07-15 12:00:06.555042] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:53.038 [2024-07-15 12:00:06.555071] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:53.038 [2024-07-15 12:00:06.555139] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:53.038 [2024-07-15 12:00:06.555164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:53.038 pt2 00:18:53.038 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:53.038 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:53.038 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.038 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:53.038 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.039 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:18:53.297 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.297 "name": "raid_bdev1", 00:18:53.297 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:53.297 "strip_size_kb": 0, 00:18:53.297 "state": "configuring", 00:18:53.297 "raid_level": "raid1", 00:18:53.297 "superblock": true, 00:18:53.297 "num_base_bdevs": 3, 00:18:53.297 "num_base_bdevs_discovered": 1, 00:18:53.297 "num_base_bdevs_operational": 2, 00:18:53.297 "base_bdevs_list": [ 00:18:53.297 { 00:18:53.297 "name": null, 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:53.297 "is_configured": false, 00:18:53.297 "data_offset": 2048, 00:18:53.297 "data_size": 63488 00:18:53.297 }, 00:18:53.297 { 00:18:53.297 "name": "pt2", 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:53.297 "is_configured": true, 00:18:53.297 "data_offset": 2048, 00:18:53.297 "data_size": 63488 00:18:53.297 }, 00:18:53.297 { 00:18:53.297 "name": null, 00:18:53.297 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:53.297 "is_configured": false, 00:18:53.297 "data_offset": 2048, 00:18:53.297 "data_size": 63488 00:18:53.297 } 00:18:53.297 ] 00:18:53.297 }' 00:18:53.297 12:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.297 12:00:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:53.864 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:18:53.864 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:18:53.864 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:18:53.864 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:54.123 [2024-07-15 12:00:07.648267] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:54.123 [2024-07-15 12:00:07.648314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.123 [2024-07-15 12:00:07.648333] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2d690 00:18:54.123 [2024-07-15 12:00:07.648346] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.123 [2024-07-15 12:00:07.648679] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.123 [2024-07-15 12:00:07.648708] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:54.123 [2024-07-15 12:00:07.648770] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:54.123 [2024-07-15 12:00:07.648789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:54.123 [2024-07-15 12:00:07.648885] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f2d030 00:18:54.123 [2024-07-15 12:00:07.648896] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:54.123 [2024-07-15 12:00:07.649061] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2f1f0 00:18:54.123 [2024-07-15 12:00:07.649185] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f2d030 00:18:54.123 [2024-07-15 12:00:07.649194] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f2d030 00:18:54.123 [2024-07-15 12:00:07.649291] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:54.123 pt3 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:54.123 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.382 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.382 "name": "raid_bdev1", 00:18:54.382 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:54.382 "strip_size_kb": 0, 00:18:54.382 "state": "online", 00:18:54.382 "raid_level": "raid1", 00:18:54.382 "superblock": true, 00:18:54.382 "num_base_bdevs": 3, 00:18:54.382 "num_base_bdevs_discovered": 2, 00:18:54.382 "num_base_bdevs_operational": 2, 00:18:54.382 "base_bdevs_list": [ 00:18:54.382 { 00:18:54.382 "name": null, 00:18:54.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:54.382 "is_configured": false, 00:18:54.382 "data_offset": 2048, 00:18:54.382 "data_size": 63488 00:18:54.382 }, 00:18:54.382 { 00:18:54.382 "name": "pt2", 00:18:54.382 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:54.382 "is_configured": true, 00:18:54.382 
"data_offset": 2048, 00:18:54.382 "data_size": 63488 00:18:54.382 }, 00:18:54.382 { 00:18:54.382 "name": "pt3", 00:18:54.382 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:54.382 "is_configured": true, 00:18:54.382 "data_offset": 2048, 00:18:54.382 "data_size": 63488 00:18:54.382 } 00:18:54.382 ] 00:18:54.382 }' 00:18:54.382 12:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.382 12:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:54.950 12:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:55.209 [2024-07-15 12:00:08.747157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:55.209 [2024-07-15 12:00:08.747182] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:55.209 [2024-07-15 12:00:08.747235] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:55.209 [2024-07-15 12:00:08.747289] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:55.209 [2024-07-15 12:00:08.747301] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f2d030 name raid_bdev1, state offline 00:18:55.209 12:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.209 12:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:18:55.467 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:18:55.467 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:18:55.467 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:18:55.467 12:00:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:18:55.467 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:55.724 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:55.993 [2024-07-15 12:00:09.497103] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:55.993 [2024-07-15 12:00:09.497152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:55.993 [2024-07-15 12:00:09.497171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2f780 00:18:55.993 [2024-07-15 12:00:09.497183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:55.993 [2024-07-15 12:00:09.498785] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:55.993 [2024-07-15 12:00:09.498812] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:55.993 [2024-07-15 12:00:09.498881] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:55.993 [2024-07-15 12:00:09.498908] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:55.993 [2024-07-15 12:00:09.499006] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:18:55.993 [2024-07-15 12:00:09.499019] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:55.993 [2024-07-15 12:00:09.499033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f31010 name raid_bdev1, state configuring 00:18:55.993 [2024-07-15 12:00:09.499056] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:55.993 pt1 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.993 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.254 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.254 "name": "raid_bdev1", 00:18:56.254 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:56.254 "strip_size_kb": 0, 00:18:56.254 "state": "configuring", 00:18:56.254 "raid_level": "raid1", 00:18:56.254 "superblock": true, 00:18:56.254 "num_base_bdevs": 3, 
00:18:56.254 "num_base_bdevs_discovered": 1, 00:18:56.254 "num_base_bdevs_operational": 2, 00:18:56.254 "base_bdevs_list": [ 00:18:56.254 { 00:18:56.254 "name": null, 00:18:56.254 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:56.254 "is_configured": false, 00:18:56.254 "data_offset": 2048, 00:18:56.254 "data_size": 63488 00:18:56.254 }, 00:18:56.254 { 00:18:56.254 "name": "pt2", 00:18:56.254 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:56.254 "is_configured": true, 00:18:56.254 "data_offset": 2048, 00:18:56.254 "data_size": 63488 00:18:56.254 }, 00:18:56.254 { 00:18:56.254 "name": null, 00:18:56.254 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:56.254 "is_configured": false, 00:18:56.254 "data_offset": 2048, 00:18:56.254 "data_size": 63488 00:18:56.254 } 00:18:56.254 ] 00:18:56.254 }' 00:18:56.254 12:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.254 12:00:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:56.820 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:18:56.820 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:57.078 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:18:57.078 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:57.336 [2024-07-15 12:00:10.716332] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:57.336 [2024-07-15 12:00:10.716387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:57.336 [2024-07-15 12:00:10.716406] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f2f9b0 00:18:57.336 [2024-07-15 12:00:10.716424] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:57.336 [2024-07-15 12:00:10.716782] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:57.336 [2024-07-15 12:00:10.716806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:57.336 [2024-07-15 12:00:10.716866] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:57.336 [2024-07-15 12:00:10.716886] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:57.336 [2024-07-15 12:00:10.716980] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e8e600 00:18:57.336 [2024-07-15 12:00:10.716991] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:18:57.336 [2024-07-15 12:00:10.717156] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f2afe0 00:18:57.336 [2024-07-15 12:00:10.717280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e8e600 00:18:57.336 [2024-07-15 12:00:10.717290] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e8e600 00:18:57.336 [2024-07-15 12:00:10.717386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:57.336 pt3 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.336 "name": "raid_bdev1", 00:18:57.336 "uuid": "d6d9d629-b702-43bf-bef4-d8fc31259b6b", 00:18:57.336 "strip_size_kb": 0, 00:18:57.336 "state": "online", 00:18:57.336 "raid_level": "raid1", 00:18:57.336 "superblock": true, 00:18:57.336 "num_base_bdevs": 3, 00:18:57.336 "num_base_bdevs_discovered": 2, 00:18:57.336 "num_base_bdevs_operational": 2, 00:18:57.336 "base_bdevs_list": [ 00:18:57.336 { 00:18:57.336 "name": null, 00:18:57.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:57.336 "is_configured": false, 00:18:57.336 "data_offset": 2048, 00:18:57.336 "data_size": 63488 00:18:57.336 }, 00:18:57.336 { 00:18:57.336 "name": "pt2", 00:18:57.336 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:57.336 "is_configured": true, 00:18:57.336 "data_offset": 2048, 00:18:57.336 "data_size": 63488 00:18:57.336 }, 00:18:57.336 { 00:18:57.336 "name": "pt3", 00:18:57.336 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:57.336 "is_configured": true, 00:18:57.336 
"data_offset": 2048, 00:18:57.336 "data_size": 63488 00:18:57.336 } 00:18:57.336 ] 00:18:57.336 }' 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.336 12:00:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.269 12:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:18:58.269 12:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:18:58.269 12:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:18:58.269 12:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:58.269 12:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:18:58.529 [2024-07-15 12:00:11.999998] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' d6d9d629-b702-43bf-bef4-d8fc31259b6b '!=' d6d9d629-b702-43bf-bef4-d8fc31259b6b ']' 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1511693 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1511693 ']' 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1511693 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1511693 00:18:58.529 
12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1511693' 00:18:58.529 killing process with pid 1511693 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1511693 00:18:58.529 [2024-07-15 12:00:12.069977] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:58.529 [2024-07-15 12:00:12.070034] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:58.529 [2024-07-15 12:00:12.070091] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:58.529 [2024-07-15 12:00:12.070103] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e8e600 name raid_bdev1, state offline 00:18:58.529 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1511693 00:18:58.529 [2024-07-15 12:00:12.100780] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:58.787 12:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:18:58.787 00:18:58.787 real 0m22.149s 00:18:58.787 user 0m40.452s 00:18:58.787 sys 0m3.995s 00:18:58.787 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:18:58.787 12:00:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.787 ************************************ 00:18:58.787 END TEST raid_superblock_test 00:18:58.787 ************************************ 00:18:58.787 12:00:12 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:18:58.787 12:00:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:18:58.787 12:00:12 bdev_raid -- 
common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:18:58.787 12:00:12 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:18:58.787 12:00:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:59.047 ************************************ 00:18:59.047 START TEST raid_read_error_test 00:18:59.047 ************************************ 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 read 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CqqHlYif9r 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1515525 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1515525 /var/tmp/spdk-raid.sock 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1515525 ']' 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:59.047 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:18:59.047 12:00:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:59.047 [2024-07-15 12:00:12.489336] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:18:59.047 [2024-07-15 12:00:12.489401] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1515525 ] 00:18:59.047 [2024-07-15 12:00:12.619966] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:59.305 [2024-07-15 12:00:12.726001] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:59.305 [2024-07-15 12:00:12.787815] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.305 [2024-07-15 12:00:12.787850] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:59.870 12:00:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:18:59.870 12:00:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:18:59.870 12:00:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:18:59.870 12:00:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:00.128 BaseBdev1_malloc 00:19:00.128 12:00:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:00.386 true 00:19:00.386 12:00:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:00.645 [2024-07-15 12:00:14.121046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:00.645 [2024-07-15 12:00:14.121093] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.645 [2024-07-15 12:00:14.121115] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x269e4e0 00:19:00.645 [2024-07-15 12:00:14.121128] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.645 [2024-07-15 12:00:14.122932] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.645 [2024-07-15 12:00:14.122962] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:00.645 BaseBdev1 00:19:00.645 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:00.645 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:00.904 BaseBdev2_malloc 00:19:00.905 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:01.163 true 00:19:01.163 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:01.422 [2024-07-15 12:00:14.880817] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:01.422 [2024-07-15 12:00:14.880860] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.422 [2024-07-15 12:00:14.880880] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a37b0 00:19:01.422 [2024-07-15 12:00:14.880893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.422 [2024-07-15 12:00:14.882449] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.422 [2024-07-15 12:00:14.882477] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:01.422 BaseBdev2 00:19:01.422 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:01.422 12:00:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:01.680 BaseBdev3_malloc 00:19:01.680 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:01.938 true 00:19:01.938 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:02.197 [2024-07-15 12:00:15.619319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:02.197 [2024-07-15 12:00:15.619363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:02.197 [2024-07-15 12:00:15.619387] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26a58f0 00:19:02.197 [2024-07-15 12:00:15.619400] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:02.197 [2024-07-15 12:00:15.621023] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:02.197 [2024-07-15 12:00:15.621051] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:02.197 BaseBdev3 00:19:02.197 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:19:02.456 [2024-07-15 12:00:15.859974] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:02.456 [2024-07-15 12:00:15.861326] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:02.456 [2024-07-15 12:00:15.861400] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:02.456 [2024-07-15 12:00:15.861601] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26a74b0 00:19:02.456 [2024-07-15 12:00:15.861612] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:02.456 [2024-07-15 12:00:15.861817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a7450 00:19:02.456 [2024-07-15 12:00:15.861976] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26a74b0 00:19:02.456 [2024-07-15 12:00:15.861986] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26a74b0 00:19:02.456 [2024-07-15 12:00:15.862091] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:02.456 12:00:15 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:02.456 12:00:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:02.715 12:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:02.715 "name": "raid_bdev1", 00:19:02.715 "uuid": "72d01449-3b73-4e7d-b902-2883e4b1ad61", 00:19:02.715 "strip_size_kb": 0, 00:19:02.716 "state": "online", 00:19:02.716 "raid_level": "raid1", 00:19:02.716 "superblock": true, 00:19:02.716 "num_base_bdevs": 3, 00:19:02.716 "num_base_bdevs_discovered": 3, 00:19:02.716 "num_base_bdevs_operational": 3, 00:19:02.716 "base_bdevs_list": [ 00:19:02.716 { 00:19:02.716 "name": "BaseBdev1", 00:19:02.716 "uuid": "3e1f1de7-c6dc-5eae-913c-6193cfb3acf3", 00:19:02.716 "is_configured": true, 00:19:02.716 "data_offset": 2048, 00:19:02.716 "data_size": 63488 00:19:02.716 }, 00:19:02.716 { 00:19:02.716 "name": "BaseBdev2", 00:19:02.716 "uuid": "ced254e2-68a0-5633-bf4f-5bdae483bcd5", 00:19:02.716 
"is_configured": true, 00:19:02.716 "data_offset": 2048, 00:19:02.716 "data_size": 63488 00:19:02.716 }, 00:19:02.716 { 00:19:02.716 "name": "BaseBdev3", 00:19:02.716 "uuid": "877883a9-d506-58ec-bc56-a363c4456b6b", 00:19:02.716 "is_configured": true, 00:19:02.716 "data_offset": 2048, 00:19:02.716 "data_size": 63488 00:19:02.716 } 00:19:02.716 ] 00:19:02.716 }' 00:19:02.716 12:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:02.716 12:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:03.329 12:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:03.329 12:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:03.329 [2024-07-15 12:00:16.818799] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ac210 00:19:04.321 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:04.580 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:04.581 12:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:04.839 12:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:04.839 "name": "raid_bdev1", 00:19:04.839 "uuid": "72d01449-3b73-4e7d-b902-2883e4b1ad61", 00:19:04.839 "strip_size_kb": 0, 00:19:04.839 "state": "online", 00:19:04.839 "raid_level": "raid1", 00:19:04.839 "superblock": true, 00:19:04.839 "num_base_bdevs": 3, 00:19:04.839 "num_base_bdevs_discovered": 3, 00:19:04.839 "num_base_bdevs_operational": 3, 00:19:04.839 "base_bdevs_list": [ 00:19:04.839 { 00:19:04.839 "name": "BaseBdev1", 00:19:04.839 "uuid": "3e1f1de7-c6dc-5eae-913c-6193cfb3acf3", 00:19:04.839 "is_configured": true, 00:19:04.839 "data_offset": 2048, 00:19:04.839 "data_size": 63488 00:19:04.839 }, 00:19:04.839 { 00:19:04.839 "name": "BaseBdev2", 00:19:04.839 "uuid": "ced254e2-68a0-5633-bf4f-5bdae483bcd5", 00:19:04.839 "is_configured": true, 00:19:04.839 "data_offset": 2048, 00:19:04.839 
"data_size": 63488 00:19:04.839 }, 00:19:04.839 { 00:19:04.839 "name": "BaseBdev3", 00:19:04.839 "uuid": "877883a9-d506-58ec-bc56-a363c4456b6b", 00:19:04.839 "is_configured": true, 00:19:04.839 "data_offset": 2048, 00:19:04.839 "data_size": 63488 00:19:04.839 } 00:19:04.839 ] 00:19:04.839 }' 00:19:04.839 12:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:04.839 12:00:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.414 12:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:05.681 [2024-07-15 12:00:19.055863] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:05.681 [2024-07-15 12:00:19.055899] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:05.681 [2024-07-15 12:00:19.059033] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:05.681 [2024-07-15 12:00:19.059067] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:05.681 [2024-07-15 12:00:19.059164] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:05.681 [2024-07-15 12:00:19.059175] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26a74b0 name raid_bdev1, state offline 00:19:05.681 0 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1515525 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1515525 ']' 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1515525 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' 
Linux = Linux ']' 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1515525 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1515525' 00:19:05.681 killing process with pid 1515525 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1515525 00:19:05.681 [2024-07-15 12:00:19.128586] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:05.681 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1515525 00:19:05.681 [2024-07-15 12:00:19.149228] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CqqHlYif9r 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:05.939 00:19:05.939 real 0m6.971s 00:19:05.939 user 0m11.064s 00:19:05.939 sys 0m1.215s 00:19:05.939 12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:05.939 
12:00:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.939 ************************************ 00:19:05.939 END TEST raid_read_error_test 00:19:05.939 ************************************ 00:19:05.939 12:00:19 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:05.939 12:00:19 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:19:05.939 12:00:19 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:05.939 12:00:19 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:05.939 12:00:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:05.939 ************************************ 00:19:05.939 START TEST raid_write_error_test 00:19:05.939 ************************************ 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 3 write 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.939 
12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.j0ErnAATqo 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1516582 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1516582 /var/tmp/spdk-raid.sock 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1516582 ']' 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:05.939 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:05.939 12:00:19 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.197 [2024-07-15 12:00:19.547224] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:19:06.197 [2024-07-15 12:00:19.547283] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1516582 ] 00:19:06.197 [2024-07-15 12:00:19.660987] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:06.197 [2024-07-15 12:00:19.763166] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:06.457 [2024-07-15 12:00:19.828533] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:06.457 [2024-07-15 12:00:19.828571] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:07.025 12:00:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:07.025 12:00:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:19:07.025 12:00:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.025 12:00:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:07.284 BaseBdev1_malloc 00:19:07.284 12:00:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:07.543 true 00:19:07.543 12:00:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:07.801 [2024-07-15 12:00:21.198140] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:07.801 [2024-07-15 12:00:21.198187] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:19:07.802 [2024-07-15 12:00:21.198206] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f24e0 00:19:07.802 [2024-07-15 12:00:21.198218] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.802 [2024-07-15 12:00:21.199848] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.802 [2024-07-15 12:00:21.199876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:07.802 BaseBdev1 00:19:07.802 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:07.802 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:08.060 BaseBdev2_malloc 00:19:08.060 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:08.318 true 00:19:08.318 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:08.576 [2024-07-15 12:00:21.944617] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:08.576 [2024-07-15 12:00:21.944655] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.576 [2024-07-15 12:00:21.944673] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f77b0 00:19:08.576 [2024-07-15 12:00:21.944691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.576 [2024-07-15 12:00:21.946090] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.576 [2024-07-15 12:00:21.946117] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:08.576 BaseBdev2 00:19:08.576 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:08.576 12:00:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:08.835 BaseBdev3_malloc 00:19:08.835 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:09.095 true 00:19:09.095 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:09.095 [2024-07-15 12:00:22.683285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:09.095 [2024-07-15 12:00:22.683331] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:09.095 [2024-07-15 12:00:22.683351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27f98f0 00:19:09.095 [2024-07-15 12:00:22.683363] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:09.095 [2024-07-15 12:00:22.684770] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:09.095 [2024-07-15 12:00:22.684796] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:09.095 BaseBdev3 00:19:09.356 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:19:09.356 [2024-07-15 12:00:22.935978] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:09.356 [2024-07-15 12:00:22.937166] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:09.356 [2024-07-15 12:00:22.937230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:09.356 [2024-07-15 12:00:22.937425] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27fb4b0 00:19:09.356 [2024-07-15 12:00:22.937436] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:19:09.356 [2024-07-15 12:00:22.937621] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27fb450 00:19:09.356 [2024-07-15 12:00:22.937781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27fb4b0 00:19:09.356 [2024-07-15 12:00:22.937792] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27fb4b0 00:19:09.356 [2024-07-15 12:00:22.937890] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.615 12:00:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:09.883 12:00:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.883 "name": "raid_bdev1", 00:19:09.883 "uuid": "2337d5d9-6d9f-415b-a15e-dba4fc198bf5", 00:19:09.884 "strip_size_kb": 0, 00:19:09.884 "state": "online", 00:19:09.884 "raid_level": "raid1", 00:19:09.884 "superblock": true, 00:19:09.884 "num_base_bdevs": 3, 00:19:09.884 "num_base_bdevs_discovered": 3, 00:19:09.884 "num_base_bdevs_operational": 3, 00:19:09.884 "base_bdevs_list": [ 00:19:09.884 { 00:19:09.884 "name": "BaseBdev1", 00:19:09.884 "uuid": "3ad2a6d9-443d-5a50-80ba-62107444a1e4", 00:19:09.884 "is_configured": true, 00:19:09.884 "data_offset": 2048, 00:19:09.884 "data_size": 63488 00:19:09.884 }, 00:19:09.884 { 00:19:09.884 "name": "BaseBdev2", 00:19:09.884 "uuid": "a876fb6a-8554-560a-a01b-60815699e8ba", 00:19:09.884 "is_configured": true, 00:19:09.884 "data_offset": 2048, 00:19:09.884 "data_size": 63488 00:19:09.884 }, 00:19:09.884 { 00:19:09.884 "name": "BaseBdev3", 00:19:09.884 "uuid": "bc68524c-4906-5419-9c81-93f385efdb17", 00:19:09.884 "is_configured": true, 00:19:09.884 "data_offset": 2048, 00:19:09.884 "data_size": 63488 00:19:09.884 } 00:19:09.884 ] 00:19:09.884 }' 00:19:09.884 12:00:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.884 12:00:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.451 12:00:23 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:19:10.451 12:00:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:10.451 [2024-07-15 12:00:23.922915] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2800210 00:19:11.387 12:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:11.646 [2024-07-15 12:00:24.987091] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:19:11.646 [2024-07-15 12:00:24.987155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:11.646 [2024-07-15 12:00:24.987351] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2800210 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.646 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:11.904 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.904 "name": "raid_bdev1", 00:19:11.904 "uuid": "2337d5d9-6d9f-415b-a15e-dba4fc198bf5", 00:19:11.904 "strip_size_kb": 0, 00:19:11.904 "state": "online", 00:19:11.904 "raid_level": "raid1", 00:19:11.904 "superblock": true, 00:19:11.904 "num_base_bdevs": 3, 00:19:11.904 "num_base_bdevs_discovered": 2, 00:19:11.904 "num_base_bdevs_operational": 2, 00:19:11.904 "base_bdevs_list": [ 00:19:11.904 { 00:19:11.904 "name": null, 00:19:11.904 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.904 "is_configured": false, 00:19:11.904 "data_offset": 2048, 00:19:11.904 "data_size": 63488 00:19:11.904 }, 00:19:11.904 { 00:19:11.904 "name": "BaseBdev2", 00:19:11.904 "uuid": "a876fb6a-8554-560a-a01b-60815699e8ba", 00:19:11.904 "is_configured": true, 00:19:11.904 "data_offset": 2048, 00:19:11.904 "data_size": 63488 00:19:11.904 }, 00:19:11.904 { 00:19:11.904 "name": "BaseBdev3", 00:19:11.904 "uuid": "bc68524c-4906-5419-9c81-93f385efdb17", 00:19:11.904 "is_configured": true, 00:19:11.904 "data_offset": 2048, 
00:19:11.904 "data_size": 63488 00:19:11.904 } 00:19:11.904 ] 00:19:11.904 }' 00:19:11.904 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.904 12:00:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.471 12:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:12.730 [2024-07-15 12:00:26.086593] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:12.730 [2024-07-15 12:00:26.086624] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.730 [2024-07-15 12:00:26.089853] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.730 [2024-07-15 12:00:26.089883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:12.730 [2024-07-15 12:00:26.089957] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:12.730 [2024-07-15 12:00:26.089969] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27fb4b0 name raid_bdev1, state offline 00:19:12.730 0 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1516582 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1516582 ']' 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1516582 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1516582 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1516582' 00:19:12.730 killing process with pid 1516582 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1516582 00:19:12.730 [2024-07-15 12:00:26.159067] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:12.730 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1516582 00:19:12.730 [2024-07-15 12:00:26.179676] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.j0ErnAATqo 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:19:12.989 00:19:12.989 real 0m6.930s 00:19:12.989 user 0m11.022s 00:19:12.989 sys 0m1.214s 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:12.989 12:00:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.989 ************************************ 00:19:12.989 END TEST raid_write_error_test 
00:19:12.989 ************************************ 00:19:12.989 12:00:26 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:12.989 12:00:26 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:19:12.989 12:00:26 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:12.989 12:00:26 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:19:12.989 12:00:26 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:12.989 12:00:26 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:12.989 12:00:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:12.989 ************************************ 00:19:12.989 START TEST raid_state_function_test 00:19:12.989 ************************************ 00:19:12.989 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 false 00:19:12.989 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:19:12.989 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # 
'[' false = true ']' 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1517580 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1517580' 00:19:12.990 Process raid pid: 1517580 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1517580 /var/tmp/spdk-raid.sock 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1517580 ']' 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:12.990 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:12.990 12:00:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:12.990 [2024-07-15 12:00:26.558832] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:19:12.990 [2024-07-15 12:00:26.558914] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:13.249 [2024-07-15 12:00:26.706049] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:13.249 [2024-07-15 12:00:26.819601] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:13.508 [2024-07-15 12:00:26.884146] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.508 [2024-07-15 12:00:26.884175] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:13.508 12:00:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:13.508 12:00:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:19:13.508 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:13.767 [2024-07-15 12:00:27.249364] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:13.767 [2024-07-15 12:00:27.249407] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:13.767 [2024-07-15 12:00:27.249418] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:13.767 [2024-07-15 12:00:27.249430] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:13.767 [2024-07-15 12:00:27.249438] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:13.767 [2024-07-15 12:00:27.249449] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:13.767 [2024-07-15 12:00:27.249461] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:13.767 [2024-07-15 12:00:27.249472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:13.767 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:14.026 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:14.026 "name": "Existed_Raid", 00:19:14.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.026 "strip_size_kb": 64, 
00:19:14.026 "state": "configuring", 00:19:14.026 "raid_level": "raid0", 00:19:14.026 "superblock": false, 00:19:14.026 "num_base_bdevs": 4, 00:19:14.026 "num_base_bdevs_discovered": 0, 00:19:14.026 "num_base_bdevs_operational": 4, 00:19:14.026 "base_bdevs_list": [ 00:19:14.026 { 00:19:14.026 "name": "BaseBdev1", 00:19:14.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.026 "is_configured": false, 00:19:14.026 "data_offset": 0, 00:19:14.026 "data_size": 0 00:19:14.026 }, 00:19:14.026 { 00:19:14.026 "name": "BaseBdev2", 00:19:14.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.026 "is_configured": false, 00:19:14.026 "data_offset": 0, 00:19:14.026 "data_size": 0 00:19:14.026 }, 00:19:14.026 { 00:19:14.026 "name": "BaseBdev3", 00:19:14.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.026 "is_configured": false, 00:19:14.026 "data_offset": 0, 00:19:14.026 "data_size": 0 00:19:14.026 }, 00:19:14.026 { 00:19:14.026 "name": "BaseBdev4", 00:19:14.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:14.026 "is_configured": false, 00:19:14.026 "data_offset": 0, 00:19:14.026 "data_size": 0 00:19:14.026 } 00:19:14.026 ] 00:19:14.026 }' 00:19:14.026 12:00:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:14.026 12:00:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:14.962 12:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:14.962 [2024-07-15 12:00:28.452399] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:14.962 [2024-07-15 12:00:28.452430] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e5b20 name Existed_Raid, state configuring 00:19:14.962 12:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:15.221 [2024-07-15 12:00:28.693063] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:15.221 [2024-07-15 12:00:28.693101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:15.221 [2024-07-15 12:00:28.693111] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:15.221 [2024-07-15 12:00:28.693122] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:15.221 [2024-07-15 12:00:28.693131] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:15.221 [2024-07-15 12:00:28.693142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:15.221 [2024-07-15 12:00:28.693151] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:15.221 [2024-07-15 12:00:28.693169] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:15.221 12:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:15.479 [2024-07-15 12:00:28.944785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:15.479 BaseBdev1 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@899 -- # local i 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:15.479 12:00:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:15.737 12:00:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:15.997 [ 00:19:15.997 { 00:19:15.997 "name": "BaseBdev1", 00:19:15.997 "aliases": [ 00:19:15.997 "d5ca824f-ba86-41c7-a067-570e741e2fe9" 00:19:15.997 ], 00:19:15.997 "product_name": "Malloc disk", 00:19:15.997 "block_size": 512, 00:19:15.997 "num_blocks": 65536, 00:19:15.997 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:15.997 "assigned_rate_limits": { 00:19:15.997 "rw_ios_per_sec": 0, 00:19:15.997 "rw_mbytes_per_sec": 0, 00:19:15.997 "r_mbytes_per_sec": 0, 00:19:15.997 "w_mbytes_per_sec": 0 00:19:15.997 }, 00:19:15.997 "claimed": true, 00:19:15.997 "claim_type": "exclusive_write", 00:19:15.997 "zoned": false, 00:19:15.997 "supported_io_types": { 00:19:15.997 "read": true, 00:19:15.997 "write": true, 00:19:15.997 "unmap": true, 00:19:15.997 "flush": true, 00:19:15.997 "reset": true, 00:19:15.997 "nvme_admin": false, 00:19:15.997 "nvme_io": false, 00:19:15.997 "nvme_io_md": false, 00:19:15.997 "write_zeroes": true, 00:19:15.997 "zcopy": true, 00:19:15.997 "get_zone_info": false, 00:19:15.997 "zone_management": false, 00:19:15.997 "zone_append": false, 00:19:15.997 "compare": false, 00:19:15.997 "compare_and_write": false, 00:19:15.997 "abort": true, 00:19:15.997 "seek_hole": false, 00:19:15.997 "seek_data": false, 00:19:15.997 "copy": true, 00:19:15.997 "nvme_iov_md": false 
00:19:15.997 }, 00:19:15.997 "memory_domains": [ 00:19:15.997 { 00:19:15.997 "dma_device_id": "system", 00:19:15.997 "dma_device_type": 1 00:19:15.997 }, 00:19:15.997 { 00:19:15.997 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:15.997 "dma_device_type": 2 00:19:15.997 } 00:19:15.997 ], 00:19:15.997 "driver_specific": {} 00:19:15.997 } 00:19:15.997 ] 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:15.997 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.256 12:00:29 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.256 "name": "Existed_Raid", 00:19:16.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.256 "strip_size_kb": 64, 00:19:16.256 "state": "configuring", 00:19:16.256 "raid_level": "raid0", 00:19:16.256 "superblock": false, 00:19:16.256 "num_base_bdevs": 4, 00:19:16.256 "num_base_bdevs_discovered": 1, 00:19:16.256 "num_base_bdevs_operational": 4, 00:19:16.256 "base_bdevs_list": [ 00:19:16.256 { 00:19:16.256 "name": "BaseBdev1", 00:19:16.256 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:16.256 "is_configured": true, 00:19:16.256 "data_offset": 0, 00:19:16.256 "data_size": 65536 00:19:16.256 }, 00:19:16.256 { 00:19:16.256 "name": "BaseBdev2", 00:19:16.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.256 "is_configured": false, 00:19:16.256 "data_offset": 0, 00:19:16.256 "data_size": 0 00:19:16.256 }, 00:19:16.256 { 00:19:16.256 "name": "BaseBdev3", 00:19:16.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.256 "is_configured": false, 00:19:16.256 "data_offset": 0, 00:19:16.256 "data_size": 0 00:19:16.256 }, 00:19:16.256 { 00:19:16.256 "name": "BaseBdev4", 00:19:16.256 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:16.256 "is_configured": false, 00:19:16.256 "data_offset": 0, 00:19:16.256 "data_size": 0 00:19:16.256 } 00:19:16.256 ] 00:19:16.256 }' 00:19:16.256 12:00:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.256 12:00:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:16.824 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:17.083 [2024-07-15 12:00:30.573097] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:17.083 [2024-07-15 12:00:30.573141] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e5390 name Existed_Raid, state configuring 00:19:17.083 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:17.342 [2024-07-15 12:00:30.817778] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:17.342 [2024-07-15 12:00:30.819266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:17.342 [2024-07-15 12:00:30.819300] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:17.342 [2024-07-15 12:00:30.819310] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:17.342 [2024-07-15 12:00:30.819322] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:17.342 [2024-07-15 12:00:30.819330] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:17.342 [2024-07-15 12:00:30.819341] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:17.342 
12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.342 12:00:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:17.602 12:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:17.602 "name": "Existed_Raid", 00:19:17.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.602 "strip_size_kb": 64, 00:19:17.602 "state": "configuring", 00:19:17.602 "raid_level": "raid0", 00:19:17.602 "superblock": false, 00:19:17.602 "num_base_bdevs": 4, 00:19:17.602 "num_base_bdevs_discovered": 1, 00:19:17.602 "num_base_bdevs_operational": 4, 00:19:17.602 "base_bdevs_list": [ 00:19:17.602 { 00:19:17.602 "name": "BaseBdev1", 00:19:17.602 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:17.602 "is_configured": true, 00:19:17.602 "data_offset": 0, 00:19:17.602 "data_size": 65536 00:19:17.602 }, 00:19:17.602 { 00:19:17.602 "name": "BaseBdev2", 00:19:17.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.602 "is_configured": false, 00:19:17.602 "data_offset": 0, 00:19:17.602 "data_size": 0 00:19:17.602 }, 00:19:17.602 { 00:19:17.602 "name": "BaseBdev3", 00:19:17.602 
"uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.602 "is_configured": false, 00:19:17.602 "data_offset": 0, 00:19:17.602 "data_size": 0 00:19:17.602 }, 00:19:17.602 { 00:19:17.602 "name": "BaseBdev4", 00:19:17.602 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:17.602 "is_configured": false, 00:19:17.602 "data_offset": 0, 00:19:17.602 "data_size": 0 00:19:17.602 } 00:19:17.602 ] 00:19:17.602 }' 00:19:17.602 12:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:17.602 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.168 12:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:18.425 [2024-07-15 12:00:31.912144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:18.425 BaseBdev2 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:18.425 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:18.426 12:00:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:18.684 12:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:18.943 [ 00:19:18.943 { 00:19:18.943 "name": "BaseBdev2", 00:19:18.943 "aliases": [ 00:19:18.943 "b6eeede2-0a6d-4f80-801c-a57d6f418afa" 00:19:18.943 ], 00:19:18.943 "product_name": "Malloc disk", 00:19:18.943 "block_size": 512, 00:19:18.943 "num_blocks": 65536, 00:19:18.943 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:18.943 "assigned_rate_limits": { 00:19:18.943 "rw_ios_per_sec": 0, 00:19:18.943 "rw_mbytes_per_sec": 0, 00:19:18.943 "r_mbytes_per_sec": 0, 00:19:18.943 "w_mbytes_per_sec": 0 00:19:18.943 }, 00:19:18.943 "claimed": true, 00:19:18.943 "claim_type": "exclusive_write", 00:19:18.943 "zoned": false, 00:19:18.943 "supported_io_types": { 00:19:18.943 "read": true, 00:19:18.943 "write": true, 00:19:18.943 "unmap": true, 00:19:18.943 "flush": true, 00:19:18.943 "reset": true, 00:19:18.943 "nvme_admin": false, 00:19:18.943 "nvme_io": false, 00:19:18.943 "nvme_io_md": false, 00:19:18.943 "write_zeroes": true, 00:19:18.943 "zcopy": true, 00:19:18.943 "get_zone_info": false, 00:19:18.943 "zone_management": false, 00:19:18.943 "zone_append": false, 00:19:18.943 "compare": false, 00:19:18.943 "compare_and_write": false, 00:19:18.943 "abort": true, 00:19:18.943 "seek_hole": false, 00:19:18.943 "seek_data": false, 00:19:18.943 "copy": true, 00:19:18.943 "nvme_iov_md": false 00:19:18.943 }, 00:19:18.943 "memory_domains": [ 00:19:18.943 { 00:19:18.943 "dma_device_id": "system", 00:19:18.943 "dma_device_type": 1 00:19:18.943 }, 00:19:18.943 { 00:19:18.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:18.943 "dma_device_type": 2 00:19:18.943 } 00:19:18.943 ], 00:19:18.943 "driver_specific": {} 00:19:18.943 } 00:19:18.943 ] 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:18.943 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:18.944 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:18.944 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:18.944 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:18.944 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:19.203 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:19.203 "name": "Existed_Raid", 00:19:19.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.203 "strip_size_kb": 64, 00:19:19.203 "state": "configuring", 00:19:19.203 "raid_level": "raid0", 00:19:19.203 "superblock": false, 00:19:19.203 "num_base_bdevs": 4, 00:19:19.203 
"num_base_bdevs_discovered": 2, 00:19:19.203 "num_base_bdevs_operational": 4, 00:19:19.203 "base_bdevs_list": [ 00:19:19.203 { 00:19:19.203 "name": "BaseBdev1", 00:19:19.203 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:19.203 "is_configured": true, 00:19:19.203 "data_offset": 0, 00:19:19.203 "data_size": 65536 00:19:19.203 }, 00:19:19.203 { 00:19:19.203 "name": "BaseBdev2", 00:19:19.203 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:19.203 "is_configured": true, 00:19:19.203 "data_offset": 0, 00:19:19.203 "data_size": 65536 00:19:19.203 }, 00:19:19.203 { 00:19:19.203 "name": "BaseBdev3", 00:19:19.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.203 "is_configured": false, 00:19:19.203 "data_offset": 0, 00:19:19.203 "data_size": 0 00:19:19.203 }, 00:19:19.203 { 00:19:19.203 "name": "BaseBdev4", 00:19:19.203 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:19.203 "is_configured": false, 00:19:19.203 "data_offset": 0, 00:19:19.203 "data_size": 0 00:19:19.203 } 00:19:19.203 ] 00:19:19.203 }' 00:19:19.203 12:00:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:19.203 12:00:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:19.771 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:20.029 [2024-07-15 12:00:33.403432] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:20.029 BaseBdev3 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:20.029 12:00:33 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:20.029 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:20.287 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:20.287 [ 00:19:20.287 { 00:19:20.287 "name": "BaseBdev3", 00:19:20.287 "aliases": [ 00:19:20.287 "996c5bbd-0a1a-43fa-aeba-7be74100d8b8" 00:19:20.287 ], 00:19:20.287 "product_name": "Malloc disk", 00:19:20.287 "block_size": 512, 00:19:20.287 "num_blocks": 65536, 00:19:20.287 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:20.287 "assigned_rate_limits": { 00:19:20.287 "rw_ios_per_sec": 0, 00:19:20.287 "rw_mbytes_per_sec": 0, 00:19:20.287 "r_mbytes_per_sec": 0, 00:19:20.287 "w_mbytes_per_sec": 0 00:19:20.287 }, 00:19:20.287 "claimed": true, 00:19:20.287 "claim_type": "exclusive_write", 00:19:20.287 "zoned": false, 00:19:20.287 "supported_io_types": { 00:19:20.287 "read": true, 00:19:20.287 "write": true, 00:19:20.287 "unmap": true, 00:19:20.287 "flush": true, 00:19:20.287 "reset": true, 00:19:20.287 "nvme_admin": false, 00:19:20.287 "nvme_io": false, 00:19:20.287 "nvme_io_md": false, 00:19:20.287 "write_zeroes": true, 00:19:20.287 "zcopy": true, 00:19:20.287 "get_zone_info": false, 00:19:20.287 "zone_management": false, 00:19:20.287 "zone_append": false, 00:19:20.287 "compare": false, 00:19:20.287 "compare_and_write": false, 00:19:20.287 "abort": true, 00:19:20.287 "seek_hole": false, 00:19:20.287 "seek_data": false, 00:19:20.287 "copy": 
true, 00:19:20.287 "nvme_iov_md": false 00:19:20.287 }, 00:19:20.287 "memory_domains": [ 00:19:20.287 { 00:19:20.287 "dma_device_id": "system", 00:19:20.287 "dma_device_type": 1 00:19:20.287 }, 00:19:20.287 { 00:19:20.287 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.287 "dma_device_type": 2 00:19:20.287 } 00:19:20.287 ], 00:19:20.287 "driver_specific": {} 00:19:20.287 } 00:19:20.287 ] 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:20.546 12:00:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:20.546 12:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:20.546 "name": "Existed_Raid", 00:19:20.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.546 "strip_size_kb": 64, 00:19:20.546 "state": "configuring", 00:19:20.546 "raid_level": "raid0", 00:19:20.546 "superblock": false, 00:19:20.546 "num_base_bdevs": 4, 00:19:20.546 "num_base_bdevs_discovered": 3, 00:19:20.546 "num_base_bdevs_operational": 4, 00:19:20.546 "base_bdevs_list": [ 00:19:20.546 { 00:19:20.546 "name": "BaseBdev1", 00:19:20.546 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:20.546 "is_configured": true, 00:19:20.546 "data_offset": 0, 00:19:20.546 "data_size": 65536 00:19:20.546 }, 00:19:20.546 { 00:19:20.546 "name": "BaseBdev2", 00:19:20.546 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:20.546 "is_configured": true, 00:19:20.546 "data_offset": 0, 00:19:20.546 "data_size": 65536 00:19:20.546 }, 00:19:20.546 { 00:19:20.546 "name": "BaseBdev3", 00:19:20.546 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:20.546 "is_configured": true, 00:19:20.546 "data_offset": 0, 00:19:20.546 "data_size": 65536 00:19:20.546 }, 00:19:20.546 { 00:19:20.546 "name": "BaseBdev4", 00:19:20.546 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:20.546 "is_configured": false, 00:19:20.546 "data_offset": 0, 00:19:20.546 "data_size": 0 00:19:20.546 } 00:19:20.546 ] 00:19:20.546 }' 00:19:20.546 12:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:20.546 12:00:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.483 12:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:21.483 [2024-07-15 12:00:34.979024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:21.483 [2024-07-15 12:00:34.979059] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20e64a0 00:19:21.483 [2024-07-15 12:00:34.979068] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:21.483 [2024-07-15 12:00:34.979320] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e60a0 00:19:21.483 [2024-07-15 12:00:34.979441] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20e64a0 00:19:21.483 [2024-07-15 12:00:34.979450] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20e64a0 00:19:21.483 [2024-07-15 12:00:34.979614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:21.483 BaseBdev4 00:19:21.483 12:00:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:21.483 12:00:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:21.483 12:00:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:21.483 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:21.483 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:21.483 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:21.483 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:21.742 12:00:35 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:22.001 [ 00:19:22.001 { 00:19:22.001 "name": "BaseBdev4", 00:19:22.001 "aliases": [ 00:19:22.001 "d3a8d8a8-a11c-4ef1-9fce-da32607f204a" 00:19:22.001 ], 00:19:22.001 "product_name": "Malloc disk", 00:19:22.001 "block_size": 512, 00:19:22.001 "num_blocks": 65536, 00:19:22.001 "uuid": "d3a8d8a8-a11c-4ef1-9fce-da32607f204a", 00:19:22.001 "assigned_rate_limits": { 00:19:22.001 "rw_ios_per_sec": 0, 00:19:22.001 "rw_mbytes_per_sec": 0, 00:19:22.001 "r_mbytes_per_sec": 0, 00:19:22.001 "w_mbytes_per_sec": 0 00:19:22.001 }, 00:19:22.001 "claimed": true, 00:19:22.001 "claim_type": "exclusive_write", 00:19:22.001 "zoned": false, 00:19:22.001 "supported_io_types": { 00:19:22.001 "read": true, 00:19:22.001 "write": true, 00:19:22.001 "unmap": true, 00:19:22.001 "flush": true, 00:19:22.001 "reset": true, 00:19:22.001 "nvme_admin": false, 00:19:22.001 "nvme_io": false, 00:19:22.001 "nvme_io_md": false, 00:19:22.001 "write_zeroes": true, 00:19:22.001 "zcopy": true, 00:19:22.001 "get_zone_info": false, 00:19:22.001 "zone_management": false, 00:19:22.001 "zone_append": false, 00:19:22.001 "compare": false, 00:19:22.001 "compare_and_write": false, 00:19:22.001 "abort": true, 00:19:22.001 "seek_hole": false, 00:19:22.001 "seek_data": false, 00:19:22.001 "copy": true, 00:19:22.001 "nvme_iov_md": false 00:19:22.001 }, 00:19:22.001 "memory_domains": [ 00:19:22.001 { 00:19:22.001 "dma_device_id": "system", 00:19:22.001 "dma_device_type": 1 00:19:22.001 }, 00:19:22.001 { 00:19:22.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:22.001 "dma_device_type": 2 00:19:22.001 } 00:19:22.001 ], 00:19:22.001 "driver_specific": {} 00:19:22.001 } 00:19:22.001 ] 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:22.001 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:22.262 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:22.262 "name": "Existed_Raid", 00:19:22.262 "uuid": "32adb7f5-7279-40f0-8720-6e92e155012f", 00:19:22.262 "strip_size_kb": 64, 00:19:22.262 "state": "online", 00:19:22.262 "raid_level": "raid0", 00:19:22.262 "superblock": false, 00:19:22.262 
"num_base_bdevs": 4, 00:19:22.262 "num_base_bdevs_discovered": 4, 00:19:22.262 "num_base_bdevs_operational": 4, 00:19:22.262 "base_bdevs_list": [ 00:19:22.262 { 00:19:22.262 "name": "BaseBdev1", 00:19:22.262 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:22.262 "is_configured": true, 00:19:22.262 "data_offset": 0, 00:19:22.262 "data_size": 65536 00:19:22.262 }, 00:19:22.262 { 00:19:22.262 "name": "BaseBdev2", 00:19:22.262 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:22.262 "is_configured": true, 00:19:22.262 "data_offset": 0, 00:19:22.262 "data_size": 65536 00:19:22.262 }, 00:19:22.262 { 00:19:22.262 "name": "BaseBdev3", 00:19:22.262 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:22.262 "is_configured": true, 00:19:22.262 "data_offset": 0, 00:19:22.262 "data_size": 65536 00:19:22.262 }, 00:19:22.262 { 00:19:22.262 "name": "BaseBdev4", 00:19:22.262 "uuid": "d3a8d8a8-a11c-4ef1-9fce-da32607f204a", 00:19:22.262 "is_configured": true, 00:19:22.262 "data_offset": 0, 00:19:22.262 "data_size": 65536 00:19:22.262 } 00:19:22.262 ] 00:19:22.262 }' 00:19:22.262 12:00:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:22.262 12:00:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:22.830 12:00:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:22.830 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:23.090 [2024-07-15 12:00:36.571579] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:23.090 "name": "Existed_Raid", 00:19:23.090 "aliases": [ 00:19:23.090 "32adb7f5-7279-40f0-8720-6e92e155012f" 00:19:23.090 ], 00:19:23.090 "product_name": "Raid Volume", 00:19:23.090 "block_size": 512, 00:19:23.090 "num_blocks": 262144, 00:19:23.090 "uuid": "32adb7f5-7279-40f0-8720-6e92e155012f", 00:19:23.090 "assigned_rate_limits": { 00:19:23.090 "rw_ios_per_sec": 0, 00:19:23.090 "rw_mbytes_per_sec": 0, 00:19:23.090 "r_mbytes_per_sec": 0, 00:19:23.090 "w_mbytes_per_sec": 0 00:19:23.090 }, 00:19:23.090 "claimed": false, 00:19:23.090 "zoned": false, 00:19:23.090 "supported_io_types": { 00:19:23.090 "read": true, 00:19:23.090 "write": true, 00:19:23.090 "unmap": true, 00:19:23.090 "flush": true, 00:19:23.090 "reset": true, 00:19:23.090 "nvme_admin": false, 00:19:23.090 "nvme_io": false, 00:19:23.090 "nvme_io_md": false, 00:19:23.090 "write_zeroes": true, 00:19:23.090 "zcopy": false, 00:19:23.090 "get_zone_info": false, 00:19:23.090 "zone_management": false, 00:19:23.090 "zone_append": false, 00:19:23.090 "compare": false, 00:19:23.090 "compare_and_write": false, 00:19:23.090 "abort": false, 00:19:23.090 "seek_hole": false, 00:19:23.090 "seek_data": false, 00:19:23.090 "copy": false, 00:19:23.090 "nvme_iov_md": false 00:19:23.090 }, 00:19:23.090 "memory_domains": [ 00:19:23.090 { 00:19:23.090 "dma_device_id": "system", 00:19:23.090 "dma_device_type": 1 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.090 
"dma_device_type": 2 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "system", 00:19:23.090 "dma_device_type": 1 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.090 "dma_device_type": 2 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "system", 00:19:23.090 "dma_device_type": 1 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.090 "dma_device_type": 2 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "system", 00:19:23.090 "dma_device_type": 1 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.090 "dma_device_type": 2 00:19:23.090 } 00:19:23.090 ], 00:19:23.090 "driver_specific": { 00:19:23.090 "raid": { 00:19:23.090 "uuid": "32adb7f5-7279-40f0-8720-6e92e155012f", 00:19:23.090 "strip_size_kb": 64, 00:19:23.090 "state": "online", 00:19:23.090 "raid_level": "raid0", 00:19:23.090 "superblock": false, 00:19:23.090 "num_base_bdevs": 4, 00:19:23.090 "num_base_bdevs_discovered": 4, 00:19:23.090 "num_base_bdevs_operational": 4, 00:19:23.090 "base_bdevs_list": [ 00:19:23.090 { 00:19:23.090 "name": "BaseBdev1", 00:19:23.090 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:23.090 "is_configured": true, 00:19:23.090 "data_offset": 0, 00:19:23.090 "data_size": 65536 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "name": "BaseBdev2", 00:19:23.090 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:23.090 "is_configured": true, 00:19:23.090 "data_offset": 0, 00:19:23.090 "data_size": 65536 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "name": "BaseBdev3", 00:19:23.090 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:23.090 "is_configured": true, 00:19:23.090 "data_offset": 0, 00:19:23.090 "data_size": 65536 00:19:23.090 }, 00:19:23.090 { 00:19:23.090 "name": "BaseBdev4", 00:19:23.090 "uuid": "d3a8d8a8-a11c-4ef1-9fce-da32607f204a", 00:19:23.090 "is_configured": true, 00:19:23.090 "data_offset": 0, 
00:19:23.090 "data_size": 65536 00:19:23.090 } 00:19:23.090 ] 00:19:23.090 } 00:19:23.090 } 00:19:23.090 }' 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:23.090 BaseBdev2 00:19:23.090 BaseBdev3 00:19:23.090 BaseBdev4' 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:23.090 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:23.350 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:23.350 "name": "BaseBdev1", 00:19:23.350 "aliases": [ 00:19:23.350 "d5ca824f-ba86-41c7-a067-570e741e2fe9" 00:19:23.350 ], 00:19:23.350 "product_name": "Malloc disk", 00:19:23.350 "block_size": 512, 00:19:23.350 "num_blocks": 65536, 00:19:23.350 "uuid": "d5ca824f-ba86-41c7-a067-570e741e2fe9", 00:19:23.350 "assigned_rate_limits": { 00:19:23.350 "rw_ios_per_sec": 0, 00:19:23.350 "rw_mbytes_per_sec": 0, 00:19:23.350 "r_mbytes_per_sec": 0, 00:19:23.350 "w_mbytes_per_sec": 0 00:19:23.350 }, 00:19:23.350 "claimed": true, 00:19:23.350 "claim_type": "exclusive_write", 00:19:23.350 "zoned": false, 00:19:23.350 "supported_io_types": { 00:19:23.350 "read": true, 00:19:23.350 "write": true, 00:19:23.350 "unmap": true, 00:19:23.350 "flush": true, 00:19:23.350 "reset": true, 00:19:23.350 "nvme_admin": false, 00:19:23.350 "nvme_io": false, 00:19:23.350 "nvme_io_md": false, 00:19:23.350 "write_zeroes": true, 00:19:23.350 "zcopy": true, 00:19:23.350 "get_zone_info": false, 00:19:23.350 "zone_management": 
false, 00:19:23.350 "zone_append": false, 00:19:23.350 "compare": false, 00:19:23.350 "compare_and_write": false, 00:19:23.350 "abort": true, 00:19:23.350 "seek_hole": false, 00:19:23.350 "seek_data": false, 00:19:23.350 "copy": true, 00:19:23.350 "nvme_iov_md": false 00:19:23.350 }, 00:19:23.350 "memory_domains": [ 00:19:23.350 { 00:19:23.350 "dma_device_id": "system", 00:19:23.350 "dma_device_type": 1 00:19:23.350 }, 00:19:23.350 { 00:19:23.350 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:23.350 "dma_device_type": 2 00:19:23.350 } 00:19:23.350 ], 00:19:23.350 "driver_specific": {} 00:19:23.350 }' 00:19:23.350 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.350 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:23.608 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:23.608 12:00:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.608 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:23.866 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:23.867 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:23.867 12:00:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:23.867 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.125 "name": "BaseBdev2", 00:19:24.125 "aliases": [ 00:19:24.125 "b6eeede2-0a6d-4f80-801c-a57d6f418afa" 00:19:24.125 ], 00:19:24.125 "product_name": "Malloc disk", 00:19:24.125 "block_size": 512, 00:19:24.125 "num_blocks": 65536, 00:19:24.125 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:24.125 "assigned_rate_limits": { 00:19:24.125 "rw_ios_per_sec": 0, 00:19:24.125 "rw_mbytes_per_sec": 0, 00:19:24.125 "r_mbytes_per_sec": 0, 00:19:24.125 "w_mbytes_per_sec": 0 00:19:24.125 }, 00:19:24.125 "claimed": true, 00:19:24.125 "claim_type": "exclusive_write", 00:19:24.125 "zoned": false, 00:19:24.125 "supported_io_types": { 00:19:24.125 "read": true, 00:19:24.125 "write": true, 00:19:24.125 "unmap": true, 00:19:24.125 "flush": true, 00:19:24.125 "reset": true, 00:19:24.125 "nvme_admin": false, 00:19:24.125 "nvme_io": false, 00:19:24.125 "nvme_io_md": false, 00:19:24.125 "write_zeroes": true, 00:19:24.125 "zcopy": true, 00:19:24.125 "get_zone_info": false, 00:19:24.125 "zone_management": false, 00:19:24.125 "zone_append": false, 00:19:24.125 "compare": false, 00:19:24.125 "compare_and_write": false, 00:19:24.125 "abort": true, 00:19:24.125 "seek_hole": false, 00:19:24.125 "seek_data": false, 00:19:24.125 "copy": true, 00:19:24.125 "nvme_iov_md": false 00:19:24.125 }, 00:19:24.125 "memory_domains": [ 00:19:24.125 { 00:19:24.125 "dma_device_id": "system", 00:19:24.125 "dma_device_type": 1 00:19:24.125 }, 00:19:24.125 { 00:19:24.125 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.125 "dma_device_type": 2 00:19:24.125 } 00:19:24.125 ], 00:19:24.125 "driver_specific": {} 00:19:24.125 
}' 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.125 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:24.383 12:00:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:24.642 "name": "BaseBdev3", 00:19:24.642 "aliases": [ 00:19:24.642 "996c5bbd-0a1a-43fa-aeba-7be74100d8b8" 00:19:24.642 ], 00:19:24.642 "product_name": "Malloc disk", 00:19:24.642 "block_size": 512, 00:19:24.642 "num_blocks": 65536, 
00:19:24.642 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:24.642 "assigned_rate_limits": { 00:19:24.642 "rw_ios_per_sec": 0, 00:19:24.642 "rw_mbytes_per_sec": 0, 00:19:24.642 "r_mbytes_per_sec": 0, 00:19:24.642 "w_mbytes_per_sec": 0 00:19:24.642 }, 00:19:24.642 "claimed": true, 00:19:24.642 "claim_type": "exclusive_write", 00:19:24.642 "zoned": false, 00:19:24.642 "supported_io_types": { 00:19:24.642 "read": true, 00:19:24.642 "write": true, 00:19:24.642 "unmap": true, 00:19:24.642 "flush": true, 00:19:24.642 "reset": true, 00:19:24.642 "nvme_admin": false, 00:19:24.642 "nvme_io": false, 00:19:24.642 "nvme_io_md": false, 00:19:24.642 "write_zeroes": true, 00:19:24.642 "zcopy": true, 00:19:24.642 "get_zone_info": false, 00:19:24.642 "zone_management": false, 00:19:24.642 "zone_append": false, 00:19:24.642 "compare": false, 00:19:24.642 "compare_and_write": false, 00:19:24.642 "abort": true, 00:19:24.642 "seek_hole": false, 00:19:24.642 "seek_data": false, 00:19:24.642 "copy": true, 00:19:24.642 "nvme_iov_md": false 00:19:24.642 }, 00:19:24.642 "memory_domains": [ 00:19:24.642 { 00:19:24.642 "dma_device_id": "system", 00:19:24.642 "dma_device_type": 1 00:19:24.642 }, 00:19:24.642 { 00:19:24.642 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:24.642 "dma_device_type": 2 00:19:24.642 } 00:19:24.642 ], 00:19:24.642 "driver_specific": {} 00:19:24.642 }' 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.642 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:24.900 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:25.158 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:25.158 "name": "BaseBdev4", 00:19:25.158 "aliases": [ 00:19:25.158 "d3a8d8a8-a11c-4ef1-9fce-da32607f204a" 00:19:25.158 ], 00:19:25.158 "product_name": "Malloc disk", 00:19:25.158 "block_size": 512, 00:19:25.158 "num_blocks": 65536, 00:19:25.158 "uuid": "d3a8d8a8-a11c-4ef1-9fce-da32607f204a", 00:19:25.158 "assigned_rate_limits": { 00:19:25.158 "rw_ios_per_sec": 0, 00:19:25.158 "rw_mbytes_per_sec": 0, 00:19:25.158 "r_mbytes_per_sec": 0, 00:19:25.158 "w_mbytes_per_sec": 0 00:19:25.158 }, 00:19:25.158 "claimed": true, 00:19:25.158 "claim_type": "exclusive_write", 00:19:25.158 "zoned": false, 00:19:25.158 "supported_io_types": { 00:19:25.158 "read": true, 00:19:25.158 "write": true, 00:19:25.158 "unmap": true, 00:19:25.158 "flush": true, 00:19:25.158 "reset": true, 00:19:25.158 "nvme_admin": false, 00:19:25.158 "nvme_io": false, 00:19:25.158 
"nvme_io_md": false, 00:19:25.158 "write_zeroes": true, 00:19:25.158 "zcopy": true, 00:19:25.158 "get_zone_info": false, 00:19:25.159 "zone_management": false, 00:19:25.159 "zone_append": false, 00:19:25.159 "compare": false, 00:19:25.159 "compare_and_write": false, 00:19:25.159 "abort": true, 00:19:25.159 "seek_hole": false, 00:19:25.159 "seek_data": false, 00:19:25.159 "copy": true, 00:19:25.159 "nvme_iov_md": false 00:19:25.159 }, 00:19:25.159 "memory_domains": [ 00:19:25.159 { 00:19:25.159 "dma_device_id": "system", 00:19:25.159 "dma_device_type": 1 00:19:25.159 }, 00:19:25.159 { 00:19:25.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:25.159 "dma_device_type": 2 00:19:25.159 } 00:19:25.159 ], 00:19:25.159 "driver_specific": {} 00:19:25.159 }' 00:19:25.159 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.159 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.417 12:00:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.677 [2024-07-15 12:00:39.246381] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.677 [2024-07-15 12:00:39.246408] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:25.677 [2024-07-15 12:00:39.246453] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:25.677 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.936 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.936 "name": "Existed_Raid", 00:19:25.936 "uuid": "32adb7f5-7279-40f0-8720-6e92e155012f", 00:19:25.936 "strip_size_kb": 64, 00:19:25.936 "state": "offline", 00:19:25.936 "raid_level": "raid0", 00:19:25.936 "superblock": false, 00:19:25.936 "num_base_bdevs": 4, 00:19:25.936 "num_base_bdevs_discovered": 3, 00:19:25.936 "num_base_bdevs_operational": 3, 00:19:25.936 "base_bdevs_list": [ 00:19:25.936 { 00:19:25.936 "name": null, 00:19:25.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.936 "is_configured": false, 00:19:25.936 "data_offset": 0, 00:19:25.936 "data_size": 65536 00:19:25.936 }, 00:19:25.936 { 00:19:25.936 "name": "BaseBdev2", 00:19:25.936 "uuid": "b6eeede2-0a6d-4f80-801c-a57d6f418afa", 00:19:25.936 "is_configured": true, 00:19:25.936 "data_offset": 0, 00:19:25.936 "data_size": 65536 00:19:25.936 }, 00:19:25.936 { 00:19:25.936 "name": "BaseBdev3", 00:19:25.936 "uuid": "996c5bbd-0a1a-43fa-aeba-7be74100d8b8", 00:19:25.936 "is_configured": true, 00:19:25.936 "data_offset": 0, 00:19:25.936 "data_size": 65536 00:19:25.936 }, 00:19:25.937 { 00:19:25.937 "name": "BaseBdev4", 00:19:25.937 "uuid": "d3a8d8a8-a11c-4ef1-9fce-da32607f204a", 00:19:25.937 "is_configured": true, 00:19:25.937 "data_offset": 0, 00:19:25.937 "data_size": 65536 00:19:25.937 } 00:19:25.937 ] 00:19:25.937 }' 
00:19:25.937 12:00:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.937 12:00:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.502 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:26.502 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:26.502 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:26.502 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.784 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:26.784 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:26.784 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:27.043 [2024-07-15 12:00:40.530841] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:27.043 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.043 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.043 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.043 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.303 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.303 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:19:27.303 12:00:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:27.578 [2024-07-15 12:00:41.028110] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:27.578 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:27.578 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:27.578 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.578 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:27.891 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:27.891 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:27.891 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:28.168 [2024-07-15 12:00:41.527995] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:28.168 [2024-07-15 12:00:41.528039] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20e64a0 name Existed_Raid, state offline 00:19:28.168 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:28.168 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:28.168 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.168 12:00:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:28.426 12:00:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:28.685 BaseBdev2 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:28.685 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:28.956 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:29.218 [ 00:19:29.218 { 00:19:29.218 "name": 
"BaseBdev2", 00:19:29.218 "aliases": [ 00:19:29.218 "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0" 00:19:29.218 ], 00:19:29.218 "product_name": "Malloc disk", 00:19:29.218 "block_size": 512, 00:19:29.218 "num_blocks": 65536, 00:19:29.218 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:29.218 "assigned_rate_limits": { 00:19:29.218 "rw_ios_per_sec": 0, 00:19:29.218 "rw_mbytes_per_sec": 0, 00:19:29.218 "r_mbytes_per_sec": 0, 00:19:29.218 "w_mbytes_per_sec": 0 00:19:29.218 }, 00:19:29.218 "claimed": false, 00:19:29.218 "zoned": false, 00:19:29.218 "supported_io_types": { 00:19:29.218 "read": true, 00:19:29.218 "write": true, 00:19:29.218 "unmap": true, 00:19:29.218 "flush": true, 00:19:29.218 "reset": true, 00:19:29.218 "nvme_admin": false, 00:19:29.218 "nvme_io": false, 00:19:29.218 "nvme_io_md": false, 00:19:29.218 "write_zeroes": true, 00:19:29.218 "zcopy": true, 00:19:29.218 "get_zone_info": false, 00:19:29.218 "zone_management": false, 00:19:29.218 "zone_append": false, 00:19:29.218 "compare": false, 00:19:29.218 "compare_and_write": false, 00:19:29.218 "abort": true, 00:19:29.218 "seek_hole": false, 00:19:29.218 "seek_data": false, 00:19:29.218 "copy": true, 00:19:29.218 "nvme_iov_md": false 00:19:29.218 }, 00:19:29.218 "memory_domains": [ 00:19:29.218 { 00:19:29.218 "dma_device_id": "system", 00:19:29.218 "dma_device_type": 1 00:19:29.218 }, 00:19:29.218 { 00:19:29.218 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.218 "dma_device_type": 2 00:19:29.218 } 00:19:29.218 ], 00:19:29.218 "driver_specific": {} 00:19:29.218 } 00:19:29.218 ] 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:29.218 BaseBdev3 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:29.218 12:00:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:29.796 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:30.059 [ 00:19:30.059 { 00:19:30.059 "name": "BaseBdev3", 00:19:30.059 "aliases": [ 00:19:30.059 "c53a0015-8196-42e0-9761-bb19f5351a46" 00:19:30.059 ], 00:19:30.059 "product_name": "Malloc disk", 00:19:30.059 "block_size": 512, 00:19:30.059 "num_blocks": 65536, 00:19:30.059 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:30.059 "assigned_rate_limits": { 00:19:30.059 "rw_ios_per_sec": 0, 00:19:30.059 "rw_mbytes_per_sec": 0, 00:19:30.059 "r_mbytes_per_sec": 0, 00:19:30.059 "w_mbytes_per_sec": 0 00:19:30.059 }, 00:19:30.059 "claimed": false, 00:19:30.059 "zoned": false, 00:19:30.059 "supported_io_types": { 00:19:30.059 "read": true, 00:19:30.059 "write": true, 00:19:30.059 "unmap": true, 00:19:30.059 "flush": true, 00:19:30.059 
"reset": true, 00:19:30.059 "nvme_admin": false, 00:19:30.059 "nvme_io": false, 00:19:30.059 "nvme_io_md": false, 00:19:30.059 "write_zeroes": true, 00:19:30.059 "zcopy": true, 00:19:30.059 "get_zone_info": false, 00:19:30.059 "zone_management": false, 00:19:30.059 "zone_append": false, 00:19:30.059 "compare": false, 00:19:30.059 "compare_and_write": false, 00:19:30.059 "abort": true, 00:19:30.059 "seek_hole": false, 00:19:30.059 "seek_data": false, 00:19:30.059 "copy": true, 00:19:30.059 "nvme_iov_md": false 00:19:30.059 }, 00:19:30.059 "memory_domains": [ 00:19:30.059 { 00:19:30.059 "dma_device_id": "system", 00:19:30.059 "dma_device_type": 1 00:19:30.059 }, 00:19:30.059 { 00:19:30.059 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.059 "dma_device_type": 2 00:19:30.059 } 00:19:30.059 ], 00:19:30.059 "driver_specific": {} 00:19:30.059 } 00:19:30.059 ] 00:19:30.059 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:30.059 12:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:30.059 12:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:30.059 12:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:30.318 BaseBdev4 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:30.318 12:00:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:30.318 12:00:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:30.577 12:00:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:30.837 [ 00:19:30.837 { 00:19:30.837 "name": "BaseBdev4", 00:19:30.837 "aliases": [ 00:19:30.837 "de6271fa-6304-47ba-ae3a-6fd1876052a2" 00:19:30.837 ], 00:19:30.837 "product_name": "Malloc disk", 00:19:30.837 "block_size": 512, 00:19:30.837 "num_blocks": 65536, 00:19:30.837 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:30.837 "assigned_rate_limits": { 00:19:30.837 "rw_ios_per_sec": 0, 00:19:30.837 "rw_mbytes_per_sec": 0, 00:19:30.837 "r_mbytes_per_sec": 0, 00:19:30.837 "w_mbytes_per_sec": 0 00:19:30.837 }, 00:19:30.837 "claimed": false, 00:19:30.837 "zoned": false, 00:19:30.837 "supported_io_types": { 00:19:30.837 "read": true, 00:19:30.837 "write": true, 00:19:30.837 "unmap": true, 00:19:30.837 "flush": true, 00:19:30.837 "reset": true, 00:19:30.837 "nvme_admin": false, 00:19:30.837 "nvme_io": false, 00:19:30.837 "nvme_io_md": false, 00:19:30.837 "write_zeroes": true, 00:19:30.837 "zcopy": true, 00:19:30.837 "get_zone_info": false, 00:19:30.837 "zone_management": false, 00:19:30.837 "zone_append": false, 00:19:30.837 "compare": false, 00:19:30.837 "compare_and_write": false, 00:19:30.837 "abort": true, 00:19:30.837 "seek_hole": false, 00:19:30.837 "seek_data": false, 00:19:30.837 "copy": true, 00:19:30.837 "nvme_iov_md": false 00:19:30.837 }, 00:19:30.837 "memory_domains": [ 00:19:30.837 { 00:19:30.837 "dma_device_id": "system", 00:19:30.837 "dma_device_type": 1 00:19:30.837 }, 00:19:30.837 { 00:19:30.837 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:30.837 "dma_device_type": 2 00:19:30.837 } 00:19:30.837 ], 00:19:30.837 "driver_specific": {} 00:19:30.837 } 00:19:30.837 ] 00:19:30.837 12:00:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:30.837 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:30.837 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:30.837 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:31.096 [2024-07-15 12:00:44.513691] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:31.096 [2024-07-15 12:00:44.513732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:31.096 [2024-07-15 12:00:44.513750] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:31.096 [2024-07-15 12:00:44.515060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:31.096 [2024-07-15 12:00:44.515100] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:31.096 
12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:31.096 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:31.355 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:31.355 "name": "Existed_Raid", 00:19:31.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.355 "strip_size_kb": 64, 00:19:31.355 "state": "configuring", 00:19:31.355 "raid_level": "raid0", 00:19:31.355 "superblock": false, 00:19:31.355 "num_base_bdevs": 4, 00:19:31.355 "num_base_bdevs_discovered": 3, 00:19:31.355 "num_base_bdevs_operational": 4, 00:19:31.355 "base_bdevs_list": [ 00:19:31.355 { 00:19:31.355 "name": "BaseBdev1", 00:19:31.355 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:31.355 "is_configured": false, 00:19:31.355 "data_offset": 0, 00:19:31.355 "data_size": 0 00:19:31.355 }, 00:19:31.355 { 00:19:31.355 "name": "BaseBdev2", 00:19:31.355 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:31.355 "is_configured": true, 00:19:31.355 "data_offset": 0, 00:19:31.355 "data_size": 65536 00:19:31.355 }, 00:19:31.355 { 00:19:31.355 "name": "BaseBdev3", 00:19:31.355 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:31.355 "is_configured": true, 00:19:31.355 "data_offset": 
0, 00:19:31.355 "data_size": 65536 00:19:31.355 }, 00:19:31.355 { 00:19:31.355 "name": "BaseBdev4", 00:19:31.355 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:31.355 "is_configured": true, 00:19:31.355 "data_offset": 0, 00:19:31.355 "data_size": 65536 00:19:31.355 } 00:19:31.355 ] 00:19:31.355 }' 00:19:31.355 12:00:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:31.355 12:00:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:31.921 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:32.180 [2024-07-15 12:00:45.644656] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:32.180 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:32.439 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:32.439 "name": "Existed_Raid", 00:19:32.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.439 "strip_size_kb": 64, 00:19:32.439 "state": "configuring", 00:19:32.439 "raid_level": "raid0", 00:19:32.439 "superblock": false, 00:19:32.439 "num_base_bdevs": 4, 00:19:32.439 "num_base_bdevs_discovered": 2, 00:19:32.439 "num_base_bdevs_operational": 4, 00:19:32.439 "base_bdevs_list": [ 00:19:32.439 { 00:19:32.439 "name": "BaseBdev1", 00:19:32.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:32.439 "is_configured": false, 00:19:32.439 "data_offset": 0, 00:19:32.439 "data_size": 0 00:19:32.439 }, 00:19:32.439 { 00:19:32.439 "name": null, 00:19:32.439 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:32.439 "is_configured": false, 00:19:32.439 "data_offset": 0, 00:19:32.439 "data_size": 65536 00:19:32.439 }, 00:19:32.439 { 00:19:32.439 "name": "BaseBdev3", 00:19:32.439 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:32.439 "is_configured": true, 00:19:32.439 "data_offset": 0, 00:19:32.439 "data_size": 65536 00:19:32.439 }, 00:19:32.439 { 00:19:32.439 "name": "BaseBdev4", 00:19:32.439 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:32.439 "is_configured": true, 00:19:32.439 "data_offset": 0, 00:19:32.439 "data_size": 65536 00:19:32.439 } 00:19:32.439 ] 00:19:32.439 }' 00:19:32.439 12:00:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:32.439 12:00:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:33.005 12:00:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.005 12:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:33.263 12:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:33.263 12:00:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:33.522 [2024-07-15 12:00:46.999621] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:33.522 BaseBdev1 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:33.522 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:33.781 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:34.040 [ 00:19:34.040 { 00:19:34.040 "name": "BaseBdev1", 00:19:34.040 "aliases": [ 00:19:34.040 
"cbc37859-d62e-48cc-b09c-96fbbb5dd4f8" 00:19:34.040 ], 00:19:34.040 "product_name": "Malloc disk", 00:19:34.040 "block_size": 512, 00:19:34.040 "num_blocks": 65536, 00:19:34.040 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:34.040 "assigned_rate_limits": { 00:19:34.040 "rw_ios_per_sec": 0, 00:19:34.040 "rw_mbytes_per_sec": 0, 00:19:34.040 "r_mbytes_per_sec": 0, 00:19:34.040 "w_mbytes_per_sec": 0 00:19:34.040 }, 00:19:34.040 "claimed": true, 00:19:34.040 "claim_type": "exclusive_write", 00:19:34.040 "zoned": false, 00:19:34.040 "supported_io_types": { 00:19:34.040 "read": true, 00:19:34.040 "write": true, 00:19:34.040 "unmap": true, 00:19:34.040 "flush": true, 00:19:34.040 "reset": true, 00:19:34.040 "nvme_admin": false, 00:19:34.040 "nvme_io": false, 00:19:34.040 "nvme_io_md": false, 00:19:34.040 "write_zeroes": true, 00:19:34.040 "zcopy": true, 00:19:34.040 "get_zone_info": false, 00:19:34.040 "zone_management": false, 00:19:34.040 "zone_append": false, 00:19:34.040 "compare": false, 00:19:34.040 "compare_and_write": false, 00:19:34.040 "abort": true, 00:19:34.040 "seek_hole": false, 00:19:34.040 "seek_data": false, 00:19:34.040 "copy": true, 00:19:34.040 "nvme_iov_md": false 00:19:34.040 }, 00:19:34.040 "memory_domains": [ 00:19:34.040 { 00:19:34.040 "dma_device_id": "system", 00:19:34.040 "dma_device_type": 1 00:19:34.040 }, 00:19:34.040 { 00:19:34.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:34.040 "dma_device_type": 2 00:19:34.040 } 00:19:34.040 ], 00:19:34.040 "driver_specific": {} 00:19:34.040 } 00:19:34.040 ] 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.040 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:34.299 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.299 "name": "Existed_Raid", 00:19:34.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:34.299 "strip_size_kb": 64, 00:19:34.299 "state": "configuring", 00:19:34.299 "raid_level": "raid0", 00:19:34.299 "superblock": false, 00:19:34.299 "num_base_bdevs": 4, 00:19:34.299 "num_base_bdevs_discovered": 3, 00:19:34.299 "num_base_bdevs_operational": 4, 00:19:34.299 "base_bdevs_list": [ 00:19:34.299 { 00:19:34.299 "name": "BaseBdev1", 00:19:34.299 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:34.299 "is_configured": true, 00:19:34.299 "data_offset": 0, 00:19:34.299 "data_size": 65536 00:19:34.299 }, 00:19:34.299 { 00:19:34.299 "name": null, 00:19:34.299 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 
00:19:34.299 "is_configured": false, 00:19:34.299 "data_offset": 0, 00:19:34.299 "data_size": 65536 00:19:34.299 }, 00:19:34.299 { 00:19:34.299 "name": "BaseBdev3", 00:19:34.299 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:34.299 "is_configured": true, 00:19:34.299 "data_offset": 0, 00:19:34.299 "data_size": 65536 00:19:34.299 }, 00:19:34.299 { 00:19:34.299 "name": "BaseBdev4", 00:19:34.299 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:34.299 "is_configured": true, 00:19:34.299 "data_offset": 0, 00:19:34.299 "data_size": 65536 00:19:34.299 } 00:19:34.299 ] 00:19:34.299 }' 00:19:34.299 12:00:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.299 12:00:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:34.865 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.865 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:35.123 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:35.123 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:35.381 [2024-07-15 12:00:48.780348] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.381 12:00:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.381 12:00:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.639 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.639 "name": "Existed_Raid", 00:19:35.639 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.639 "strip_size_kb": 64, 00:19:35.639 "state": "configuring", 00:19:35.639 "raid_level": "raid0", 00:19:35.639 "superblock": false, 00:19:35.639 "num_base_bdevs": 4, 00:19:35.639 "num_base_bdevs_discovered": 2, 00:19:35.639 "num_base_bdevs_operational": 4, 00:19:35.639 "base_bdevs_list": [ 00:19:35.639 { 00:19:35.639 "name": "BaseBdev1", 00:19:35.639 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:35.639 "is_configured": true, 00:19:35.639 "data_offset": 0, 00:19:35.639 "data_size": 65536 00:19:35.639 }, 00:19:35.639 { 00:19:35.639 "name": null, 00:19:35.639 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:35.639 "is_configured": false, 00:19:35.639 "data_offset": 0, 00:19:35.639 
"data_size": 65536 00:19:35.639 }, 00:19:35.639 { 00:19:35.639 "name": null, 00:19:35.639 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:35.639 "is_configured": false, 00:19:35.639 "data_offset": 0, 00:19:35.639 "data_size": 65536 00:19:35.639 }, 00:19:35.639 { 00:19:35.639 "name": "BaseBdev4", 00:19:35.639 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:35.639 "is_configured": true, 00:19:35.639 "data_offset": 0, 00:19:35.639 "data_size": 65536 00:19:35.639 } 00:19:35.639 ] 00:19:35.639 }' 00:19:35.639 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.639 12:00:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:36.205 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:36.205 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.463 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:36.463 12:00:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:36.721 [2024-07-15 12:00:50.111920] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.721 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.980 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.980 "name": "Existed_Raid", 00:19:36.980 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.980 "strip_size_kb": 64, 00:19:36.980 "state": "configuring", 00:19:36.980 "raid_level": "raid0", 00:19:36.980 "superblock": false, 00:19:36.980 "num_base_bdevs": 4, 00:19:36.980 "num_base_bdevs_discovered": 3, 00:19:36.980 "num_base_bdevs_operational": 4, 00:19:36.980 "base_bdevs_list": [ 00:19:36.980 { 00:19:36.980 "name": "BaseBdev1", 00:19:36.980 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:36.980 "is_configured": true, 00:19:36.980 "data_offset": 0, 00:19:36.980 "data_size": 65536 00:19:36.980 }, 00:19:36.980 { 00:19:36.980 "name": null, 00:19:36.980 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:36.980 "is_configured": false, 00:19:36.980 "data_offset": 0, 00:19:36.980 "data_size": 65536 00:19:36.980 }, 00:19:36.980 { 00:19:36.980 "name": 
"BaseBdev3", 00:19:36.980 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:36.980 "is_configured": true, 00:19:36.980 "data_offset": 0, 00:19:36.980 "data_size": 65536 00:19:36.980 }, 00:19:36.980 { 00:19:36.980 "name": "BaseBdev4", 00:19:36.980 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:36.980 "is_configured": true, 00:19:36.980 "data_offset": 0, 00:19:36.980 "data_size": 65536 00:19:36.980 } 00:19:36.980 ] 00:19:36.980 }' 00:19:36.980 12:00:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.980 12:00:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.549 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:37.549 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.808 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:37.808 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:38.085 [2024-07-15 12:00:51.471516] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.085 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.344 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.344 "name": "Existed_Raid", 00:19:38.344 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.344 "strip_size_kb": 64, 00:19:38.344 "state": "configuring", 00:19:38.344 "raid_level": "raid0", 00:19:38.344 "superblock": false, 00:19:38.344 "num_base_bdevs": 4, 00:19:38.344 "num_base_bdevs_discovered": 2, 00:19:38.344 "num_base_bdevs_operational": 4, 00:19:38.344 "base_bdevs_list": [ 00:19:38.344 { 00:19:38.344 "name": null, 00:19:38.344 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:38.344 "is_configured": false, 00:19:38.344 "data_offset": 0, 00:19:38.344 "data_size": 65536 00:19:38.344 }, 00:19:38.344 { 00:19:38.344 "name": null, 00:19:38.344 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:38.344 "is_configured": false, 00:19:38.344 "data_offset": 0, 00:19:38.344 "data_size": 65536 00:19:38.344 }, 00:19:38.344 { 00:19:38.344 "name": "BaseBdev3", 00:19:38.344 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:38.344 "is_configured": true, 
00:19:38.344 "data_offset": 0, 00:19:38.344 "data_size": 65536 00:19:38.344 }, 00:19:38.344 { 00:19:38.344 "name": "BaseBdev4", 00:19:38.344 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:38.344 "is_configured": true, 00:19:38.344 "data_offset": 0, 00:19:38.344 "data_size": 65536 00:19:38.344 } 00:19:38.344 ] 00:19:38.344 }' 00:19:38.344 12:00:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.344 12:00:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.910 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:38.910 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:39.169 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:39.169 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:39.169 [2024-07-15 12:00:52.763284] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.427 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.428 12:00:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.686 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.686 "name": "Existed_Raid", 00:19:39.686 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.686 "strip_size_kb": 64, 00:19:39.686 "state": "configuring", 00:19:39.686 "raid_level": "raid0", 00:19:39.686 "superblock": false, 00:19:39.686 "num_base_bdevs": 4, 00:19:39.686 "num_base_bdevs_discovered": 3, 00:19:39.686 "num_base_bdevs_operational": 4, 00:19:39.686 "base_bdevs_list": [ 00:19:39.686 { 00:19:39.686 "name": null, 00:19:39.686 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:39.686 "is_configured": false, 00:19:39.686 "data_offset": 0, 00:19:39.686 "data_size": 65536 00:19:39.686 }, 00:19:39.686 { 00:19:39.686 "name": "BaseBdev2", 00:19:39.686 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:39.686 "is_configured": true, 00:19:39.686 "data_offset": 0, 00:19:39.686 "data_size": 65536 00:19:39.686 }, 00:19:39.686 { 00:19:39.686 "name": "BaseBdev3", 00:19:39.686 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:39.686 "is_configured": true, 00:19:39.686 "data_offset": 0, 00:19:39.686 "data_size": 65536 00:19:39.686 }, 
00:19:39.686 { 00:19:39.686 "name": "BaseBdev4", 00:19:39.686 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:39.686 "is_configured": true, 00:19:39.686 "data_offset": 0, 00:19:39.686 "data_size": 65536 00:19:39.686 } 00:19:39.686 ] 00:19:39.686 }' 00:19:39.686 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.686 12:00:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.256 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.256 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:40.256 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:40.256 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.256 12:00:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:40.516 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u cbc37859-d62e-48cc-b09c-96fbbb5dd4f8 00:19:40.776 [2024-07-15 12:00:54.302670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:40.776 [2024-07-15 12:00:54.302715] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20eb240 00:19:40.776 [2024-07-15 12:00:54.302724] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:40.776 [2024-07-15 12:00:54.302917] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20e4c20 00:19:40.776 
[2024-07-15 12:00:54.303030] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20eb240 00:19:40.776 [2024-07-15 12:00:54.303040] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x20eb240 00:19:40.776 [2024-07-15 12:00:54.303199] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:40.776 NewBaseBdev 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:40.776 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:41.035 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:41.298 [ 00:19:41.298 { 00:19:41.298 "name": "NewBaseBdev", 00:19:41.298 "aliases": [ 00:19:41.298 "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8" 00:19:41.298 ], 00:19:41.298 "product_name": "Malloc disk", 00:19:41.298 "block_size": 512, 00:19:41.298 "num_blocks": 65536, 00:19:41.298 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:41.298 "assigned_rate_limits": { 00:19:41.298 "rw_ios_per_sec": 0, 00:19:41.298 "rw_mbytes_per_sec": 0, 00:19:41.298 "r_mbytes_per_sec": 0, 00:19:41.298 
"w_mbytes_per_sec": 0 00:19:41.298 }, 00:19:41.298 "claimed": true, 00:19:41.298 "claim_type": "exclusive_write", 00:19:41.298 "zoned": false, 00:19:41.298 "supported_io_types": { 00:19:41.298 "read": true, 00:19:41.298 "write": true, 00:19:41.298 "unmap": true, 00:19:41.298 "flush": true, 00:19:41.298 "reset": true, 00:19:41.298 "nvme_admin": false, 00:19:41.298 "nvme_io": false, 00:19:41.298 "nvme_io_md": false, 00:19:41.298 "write_zeroes": true, 00:19:41.298 "zcopy": true, 00:19:41.298 "get_zone_info": false, 00:19:41.298 "zone_management": false, 00:19:41.298 "zone_append": false, 00:19:41.298 "compare": false, 00:19:41.298 "compare_and_write": false, 00:19:41.298 "abort": true, 00:19:41.298 "seek_hole": false, 00:19:41.298 "seek_data": false, 00:19:41.298 "copy": true, 00:19:41.298 "nvme_iov_md": false 00:19:41.298 }, 00:19:41.298 "memory_domains": [ 00:19:41.298 { 00:19:41.298 "dma_device_id": "system", 00:19:41.298 "dma_device_type": 1 00:19:41.298 }, 00:19:41.298 { 00:19:41.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.298 "dma_device_type": 2 00:19:41.298 } 00:19:41.298 ], 00:19:41.298 "driver_specific": {} 00:19:41.298 } 00:19:41.298 ] 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.298 12:00:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.559 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:41.559 "name": "Existed_Raid", 00:19:41.559 "uuid": "b1f77b86-2625-4541-bc28-2f37743e036a", 00:19:41.559 "strip_size_kb": 64, 00:19:41.559 "state": "online", 00:19:41.559 "raid_level": "raid0", 00:19:41.559 "superblock": false, 00:19:41.559 "num_base_bdevs": 4, 00:19:41.559 "num_base_bdevs_discovered": 4, 00:19:41.559 "num_base_bdevs_operational": 4, 00:19:41.559 "base_bdevs_list": [ 00:19:41.559 { 00:19:41.559 "name": "NewBaseBdev", 00:19:41.559 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:41.559 "is_configured": true, 00:19:41.559 "data_offset": 0, 00:19:41.559 "data_size": 65536 00:19:41.559 }, 00:19:41.559 { 00:19:41.559 "name": "BaseBdev2", 00:19:41.559 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:41.559 "is_configured": true, 00:19:41.559 "data_offset": 0, 00:19:41.559 "data_size": 65536 00:19:41.559 }, 00:19:41.559 { 00:19:41.559 "name": "BaseBdev3", 00:19:41.559 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:41.559 "is_configured": true, 00:19:41.559 "data_offset": 0, 00:19:41.559 "data_size": 65536 00:19:41.559 }, 00:19:41.559 { 00:19:41.559 "name": "BaseBdev4", 
00:19:41.559 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:41.559 "is_configured": true, 00:19:41.559 "data_offset": 0, 00:19:41.559 "data_size": 65536 00:19:41.559 } 00:19:41.559 ] 00:19:41.559 }' 00:19:41.559 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.559 12:00:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:42.126 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:42.385 [2024-07-15 12:00:55.867147] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:42.385 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:42.385 "name": "Existed_Raid", 00:19:42.386 "aliases": [ 00:19:42.386 "b1f77b86-2625-4541-bc28-2f37743e036a" 00:19:42.386 ], 00:19:42.386 "product_name": "Raid Volume", 00:19:42.386 "block_size": 512, 00:19:42.386 "num_blocks": 262144, 00:19:42.386 "uuid": "b1f77b86-2625-4541-bc28-2f37743e036a", 00:19:42.386 "assigned_rate_limits": { 00:19:42.386 "rw_ios_per_sec": 0, 00:19:42.386 
"rw_mbytes_per_sec": 0, 00:19:42.386 "r_mbytes_per_sec": 0, 00:19:42.386 "w_mbytes_per_sec": 0 00:19:42.386 }, 00:19:42.386 "claimed": false, 00:19:42.386 "zoned": false, 00:19:42.386 "supported_io_types": { 00:19:42.386 "read": true, 00:19:42.386 "write": true, 00:19:42.386 "unmap": true, 00:19:42.386 "flush": true, 00:19:42.386 "reset": true, 00:19:42.386 "nvme_admin": false, 00:19:42.386 "nvme_io": false, 00:19:42.386 "nvme_io_md": false, 00:19:42.386 "write_zeroes": true, 00:19:42.386 "zcopy": false, 00:19:42.386 "get_zone_info": false, 00:19:42.386 "zone_management": false, 00:19:42.386 "zone_append": false, 00:19:42.386 "compare": false, 00:19:42.386 "compare_and_write": false, 00:19:42.386 "abort": false, 00:19:42.386 "seek_hole": false, 00:19:42.386 "seek_data": false, 00:19:42.386 "copy": false, 00:19:42.386 "nvme_iov_md": false 00:19:42.386 }, 00:19:42.386 "memory_domains": [ 00:19:42.386 { 00:19:42.386 "dma_device_id": "system", 00:19:42.386 "dma_device_type": 1 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.386 "dma_device_type": 2 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "system", 00:19:42.386 "dma_device_type": 1 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.386 "dma_device_type": 2 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "system", 00:19:42.386 "dma_device_type": 1 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.386 "dma_device_type": 2 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "system", 00:19:42.386 "dma_device_type": 1 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.386 "dma_device_type": 2 00:19:42.386 } 00:19:42.386 ], 00:19:42.386 "driver_specific": { 00:19:42.386 "raid": { 00:19:42.386 "uuid": "b1f77b86-2625-4541-bc28-2f37743e036a", 00:19:42.386 "strip_size_kb": 64, 00:19:42.386 "state": "online", 
00:19:42.386 "raid_level": "raid0", 00:19:42.386 "superblock": false, 00:19:42.386 "num_base_bdevs": 4, 00:19:42.386 "num_base_bdevs_discovered": 4, 00:19:42.386 "num_base_bdevs_operational": 4, 00:19:42.386 "base_bdevs_list": [ 00:19:42.386 { 00:19:42.386 "name": "NewBaseBdev", 00:19:42.386 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:42.386 "is_configured": true, 00:19:42.386 "data_offset": 0, 00:19:42.386 "data_size": 65536 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "name": "BaseBdev2", 00:19:42.386 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:42.386 "is_configured": true, 00:19:42.386 "data_offset": 0, 00:19:42.386 "data_size": 65536 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "name": "BaseBdev3", 00:19:42.386 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:42.386 "is_configured": true, 00:19:42.386 "data_offset": 0, 00:19:42.386 "data_size": 65536 00:19:42.386 }, 00:19:42.386 { 00:19:42.386 "name": "BaseBdev4", 00:19:42.386 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:42.386 "is_configured": true, 00:19:42.386 "data_offset": 0, 00:19:42.386 "data_size": 65536 00:19:42.386 } 00:19:42.386 ] 00:19:42.386 } 00:19:42.386 } 00:19:42.386 }' 00:19:42.386 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:42.386 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:42.386 BaseBdev2 00:19:42.386 BaseBdev3 00:19:42.386 BaseBdev4' 00:19:42.386 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.386 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:42.386 12:00:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.645 12:00:56 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.645 "name": "NewBaseBdev", 00:19:42.645 "aliases": [ 00:19:42.645 "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8" 00:19:42.645 ], 00:19:42.645 "product_name": "Malloc disk", 00:19:42.645 "block_size": 512, 00:19:42.645 "num_blocks": 65536, 00:19:42.645 "uuid": "cbc37859-d62e-48cc-b09c-96fbbb5dd4f8", 00:19:42.645 "assigned_rate_limits": { 00:19:42.645 "rw_ios_per_sec": 0, 00:19:42.645 "rw_mbytes_per_sec": 0, 00:19:42.645 "r_mbytes_per_sec": 0, 00:19:42.645 "w_mbytes_per_sec": 0 00:19:42.645 }, 00:19:42.645 "claimed": true, 00:19:42.645 "claim_type": "exclusive_write", 00:19:42.645 "zoned": false, 00:19:42.645 "supported_io_types": { 00:19:42.645 "read": true, 00:19:42.645 "write": true, 00:19:42.645 "unmap": true, 00:19:42.645 "flush": true, 00:19:42.645 "reset": true, 00:19:42.645 "nvme_admin": false, 00:19:42.645 "nvme_io": false, 00:19:42.645 "nvme_io_md": false, 00:19:42.645 "write_zeroes": true, 00:19:42.645 "zcopy": true, 00:19:42.645 "get_zone_info": false, 00:19:42.645 "zone_management": false, 00:19:42.645 "zone_append": false, 00:19:42.645 "compare": false, 00:19:42.645 "compare_and_write": false, 00:19:42.645 "abort": true, 00:19:42.645 "seek_hole": false, 00:19:42.645 "seek_data": false, 00:19:42.645 "copy": true, 00:19:42.645 "nvme_iov_md": false 00:19:42.645 }, 00:19:42.645 "memory_domains": [ 00:19:42.645 { 00:19:42.645 "dma_device_id": "system", 00:19:42.645 "dma_device_type": 1 00:19:42.645 }, 00:19:42.645 { 00:19:42.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.645 "dma_device_type": 2 00:19:42.645 } 00:19:42.645 ], 00:19:42.645 "driver_specific": {} 00:19:42.645 }' 00:19:42.645 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.645 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.904 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.163 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.163 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.163 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:43.163 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.421 "name": "BaseBdev2", 00:19:43.421 "aliases": [ 00:19:43.421 "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0" 00:19:43.421 ], 00:19:43.421 "product_name": "Malloc disk", 00:19:43.421 "block_size": 512, 00:19:43.421 "num_blocks": 65536, 00:19:43.421 "uuid": "2e569db6-1a68-4e7f-95a3-5d36e4c52ae0", 00:19:43.421 "assigned_rate_limits": { 00:19:43.421 "rw_ios_per_sec": 0, 00:19:43.421 "rw_mbytes_per_sec": 0, 00:19:43.421 "r_mbytes_per_sec": 0, 00:19:43.421 "w_mbytes_per_sec": 0 00:19:43.421 }, 00:19:43.421 "claimed": true, 00:19:43.421 
"claim_type": "exclusive_write", 00:19:43.421 "zoned": false, 00:19:43.421 "supported_io_types": { 00:19:43.421 "read": true, 00:19:43.421 "write": true, 00:19:43.421 "unmap": true, 00:19:43.421 "flush": true, 00:19:43.421 "reset": true, 00:19:43.421 "nvme_admin": false, 00:19:43.421 "nvme_io": false, 00:19:43.421 "nvme_io_md": false, 00:19:43.421 "write_zeroes": true, 00:19:43.421 "zcopy": true, 00:19:43.421 "get_zone_info": false, 00:19:43.421 "zone_management": false, 00:19:43.421 "zone_append": false, 00:19:43.421 "compare": false, 00:19:43.421 "compare_and_write": false, 00:19:43.421 "abort": true, 00:19:43.421 "seek_hole": false, 00:19:43.421 "seek_data": false, 00:19:43.421 "copy": true, 00:19:43.421 "nvme_iov_md": false 00:19:43.421 }, 00:19:43.421 "memory_domains": [ 00:19:43.421 { 00:19:43.421 "dma_device_id": "system", 00:19:43.421 "dma_device_type": 1 00:19:43.421 }, 00:19:43.421 { 00:19:43.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.421 "dma_device_type": 2 00:19:43.421 } 00:19:43.421 ], 00:19:43.421 "driver_specific": {} 00:19:43.421 }' 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.421 12:00:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:43.679 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.950 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.950 "name": "BaseBdev3", 00:19:43.951 "aliases": [ 00:19:43.951 "c53a0015-8196-42e0-9761-bb19f5351a46" 00:19:43.951 ], 00:19:43.951 "product_name": "Malloc disk", 00:19:43.951 "block_size": 512, 00:19:43.951 "num_blocks": 65536, 00:19:43.951 "uuid": "c53a0015-8196-42e0-9761-bb19f5351a46", 00:19:43.951 "assigned_rate_limits": { 00:19:43.951 "rw_ios_per_sec": 0, 00:19:43.951 "rw_mbytes_per_sec": 0, 00:19:43.951 "r_mbytes_per_sec": 0, 00:19:43.951 "w_mbytes_per_sec": 0 00:19:43.951 }, 00:19:43.951 "claimed": true, 00:19:43.951 "claim_type": "exclusive_write", 00:19:43.951 "zoned": false, 00:19:43.951 "supported_io_types": { 00:19:43.951 "read": true, 00:19:43.951 "write": true, 00:19:43.951 "unmap": true, 00:19:43.951 "flush": true, 00:19:43.951 "reset": true, 00:19:43.951 "nvme_admin": false, 00:19:43.951 "nvme_io": false, 00:19:43.951 "nvme_io_md": false, 00:19:43.951 "write_zeroes": true, 00:19:43.951 "zcopy": true, 00:19:43.951 "get_zone_info": false, 00:19:43.951 "zone_management": false, 00:19:43.951 "zone_append": false, 00:19:43.951 "compare": false, 00:19:43.951 "compare_and_write": false, 00:19:43.951 "abort": true, 00:19:43.951 
"seek_hole": false, 00:19:43.951 "seek_data": false, 00:19:43.951 "copy": true, 00:19:43.951 "nvme_iov_md": false 00:19:43.951 }, 00:19:43.951 "memory_domains": [ 00:19:43.951 { 00:19:43.951 "dma_device_id": "system", 00:19:43.951 "dma_device_type": 1 00:19:43.951 }, 00:19:43.951 { 00:19:43.951 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.951 "dma_device_type": 2 00:19:43.951 } 00:19:43.951 ], 00:19:43.951 "driver_specific": {} 00:19:43.951 }' 00:19:43.951 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.951 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.951 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.951 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.952 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:19:44.216 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:44.475 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:44.475 "name": "BaseBdev4", 00:19:44.475 "aliases": [ 00:19:44.475 "de6271fa-6304-47ba-ae3a-6fd1876052a2" 00:19:44.475 ], 00:19:44.475 "product_name": "Malloc disk", 00:19:44.475 "block_size": 512, 00:19:44.475 "num_blocks": 65536, 00:19:44.475 "uuid": "de6271fa-6304-47ba-ae3a-6fd1876052a2", 00:19:44.475 "assigned_rate_limits": { 00:19:44.475 "rw_ios_per_sec": 0, 00:19:44.475 "rw_mbytes_per_sec": 0, 00:19:44.475 "r_mbytes_per_sec": 0, 00:19:44.475 "w_mbytes_per_sec": 0 00:19:44.475 }, 00:19:44.475 "claimed": true, 00:19:44.475 "claim_type": "exclusive_write", 00:19:44.475 "zoned": false, 00:19:44.475 "supported_io_types": { 00:19:44.475 "read": true, 00:19:44.475 "write": true, 00:19:44.475 "unmap": true, 00:19:44.475 "flush": true, 00:19:44.475 "reset": true, 00:19:44.475 "nvme_admin": false, 00:19:44.475 "nvme_io": false, 00:19:44.475 "nvme_io_md": false, 00:19:44.475 "write_zeroes": true, 00:19:44.475 "zcopy": true, 00:19:44.475 "get_zone_info": false, 00:19:44.475 "zone_management": false, 00:19:44.475 "zone_append": false, 00:19:44.475 "compare": false, 00:19:44.475 "compare_and_write": false, 00:19:44.475 "abort": true, 00:19:44.475 "seek_hole": false, 00:19:44.475 "seek_data": false, 00:19:44.475 "copy": true, 00:19:44.475 "nvme_iov_md": false 00:19:44.475 }, 00:19:44.475 "memory_domains": [ 00:19:44.475 { 00:19:44.475 "dma_device_id": "system", 00:19:44.475 "dma_device_type": 1 00:19:44.475 }, 00:19:44.475 { 00:19:44.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.475 "dma_device_type": 2 00:19:44.475 } 00:19:44.475 ], 00:19:44.475 "driver_specific": {} 00:19:44.475 }' 00:19:44.475 12:00:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.475 12:00:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:44.475 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:44.475 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.733 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:44.992 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:44.992 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:44.992 [2024-07-15 12:00:58.561944] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:44.992 [2024-07-15 12:00:58.561970] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:44.992 [2024-07-15 12:00:58.562025] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:44.992 [2024-07-15 12:00:58.562085] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:44.992 [2024-07-15 12:00:58.562097] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20eb240 name Existed_Raid, state offline 00:19:44.992 12:00:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1517580 00:19:44.992 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1517580 ']' 00:19:44.992 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1517580 00:19:44.992 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1517580 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1517580' 00:19:45.257 killing process with pid 1517580 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1517580 00:19:45.257 [2024-07-15 12:00:58.631858] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:45.257 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1517580 00:19:45.257 [2024-07-15 12:00:58.674931] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:45.516 00:19:45.516 real 0m32.410s 00:19:45.516 user 0m59.882s 00:19:45.516 sys 0m5.909s 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.516 ************************************ 00:19:45.516 END TEST 
raid_state_function_test 00:19:45.516 ************************************ 00:19:45.516 12:00:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:19:45.516 12:00:58 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:19:45.516 12:00:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:19:45.516 12:00:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:19:45.516 12:00:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:45.516 ************************************ 00:19:45.516 START TEST raid_state_function_test_sb 00:19:45.516 ************************************ 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid0 4 true 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:45.516 12:00:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1522300 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1522300' 00:19:45.516 Process raid pid: 1522300 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:45.516 12:00:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1522300 /var/tmp/spdk-raid.sock 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1522300 ']' 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:45.516 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:19:45.516 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:45.516 [2024-07-15 12:00:59.060268] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:19:45.516 [2024-07-15 12:00:59.060338] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:45.775 [2024-07-15 12:00:59.191426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:45.775 [2024-07-15 12:00:59.292589] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:45.775 [2024-07-15 12:00:59.358154] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:45.775 [2024-07-15 12:00:59.358189] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:46.713 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:19:46.713 12:00:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:19:46.713 12:00:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:46.713 [2024-07-15 12:01:00.225135] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:46.713 [2024-07-15 12:01:00.225185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:46.713 [2024-07-15 12:01:00.225196] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:46.713 [2024-07-15 12:01:00.225208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:46.713 [2024-07-15 12:01:00.225217] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:46.713 [2024-07-15 12:01:00.225228] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:46.713 [2024-07-15 12:01:00.225241] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:46.713 [2024-07-15 12:01:00.225252] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.713 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.972 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.972 "name": "Existed_Raid", 00:19:46.972 "uuid": "c86faf88-4abf-43d3-a2cb-da54f1c9ac1a", 
00:19:46.972 "strip_size_kb": 64, 00:19:46.972 "state": "configuring", 00:19:46.972 "raid_level": "raid0", 00:19:46.972 "superblock": true, 00:19:46.972 "num_base_bdevs": 4, 00:19:46.972 "num_base_bdevs_discovered": 0, 00:19:46.972 "num_base_bdevs_operational": 4, 00:19:46.972 "base_bdevs_list": [ 00:19:46.972 { 00:19:46.972 "name": "BaseBdev1", 00:19:46.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.972 "is_configured": false, 00:19:46.972 "data_offset": 0, 00:19:46.972 "data_size": 0 00:19:46.972 }, 00:19:46.972 { 00:19:46.972 "name": "BaseBdev2", 00:19:46.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.972 "is_configured": false, 00:19:46.972 "data_offset": 0, 00:19:46.972 "data_size": 0 00:19:46.972 }, 00:19:46.972 { 00:19:46.972 "name": "BaseBdev3", 00:19:46.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.972 "is_configured": false, 00:19:46.972 "data_offset": 0, 00:19:46.972 "data_size": 0 00:19:46.972 }, 00:19:46.972 { 00:19:46.972 "name": "BaseBdev4", 00:19:46.972 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.972 "is_configured": false, 00:19:46.972 "data_offset": 0, 00:19:46.972 "data_size": 0 00:19:46.972 } 00:19:46.972 ] 00:19:46.972 }' 00:19:46.972 12:01:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.972 12:01:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:47.539 12:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:47.798 [2024-07-15 12:01:01.311854] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:47.798 [2024-07-15 12:01:01.311887] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x242cb20 name Existed_Raid, state configuring 00:19:47.798 12:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:48.057 [2024-07-15 12:01:01.556532] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:48.057 [2024-07-15 12:01:01.556561] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:48.057 [2024-07-15 12:01:01.556571] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:48.057 [2024-07-15 12:01:01.556582] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:48.057 [2024-07-15 12:01:01.556591] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:48.057 [2024-07-15 12:01:01.556602] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:48.057 [2024-07-15 12:01:01.556611] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:48.057 [2024-07-15 12:01:01.556622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:48.057 12:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:48.316 [2024-07-15 12:01:01.810926] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:48.316 BaseBdev1 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:48.316 12:01:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:48.316 12:01:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:48.575 12:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:48.834 [ 00:19:48.834 { 00:19:48.834 "name": "BaseBdev1", 00:19:48.834 "aliases": [ 00:19:48.834 "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b" 00:19:48.834 ], 00:19:48.834 "product_name": "Malloc disk", 00:19:48.834 "block_size": 512, 00:19:48.834 "num_blocks": 65536, 00:19:48.834 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:48.834 "assigned_rate_limits": { 00:19:48.834 "rw_ios_per_sec": 0, 00:19:48.834 "rw_mbytes_per_sec": 0, 00:19:48.834 "r_mbytes_per_sec": 0, 00:19:48.834 "w_mbytes_per_sec": 0 00:19:48.834 }, 00:19:48.834 "claimed": true, 00:19:48.834 "claim_type": "exclusive_write", 00:19:48.834 "zoned": false, 00:19:48.834 "supported_io_types": { 00:19:48.834 "read": true, 00:19:48.834 "write": true, 00:19:48.834 "unmap": true, 00:19:48.834 "flush": true, 00:19:48.834 "reset": true, 00:19:48.834 "nvme_admin": false, 00:19:48.834 "nvme_io": false, 00:19:48.834 "nvme_io_md": false, 00:19:48.834 "write_zeroes": true, 00:19:48.834 "zcopy": true, 00:19:48.834 "get_zone_info": false, 00:19:48.834 "zone_management": false, 00:19:48.834 "zone_append": false, 00:19:48.834 "compare": false, 00:19:48.834 "compare_and_write": false, 00:19:48.834 "abort": true, 00:19:48.834 "seek_hole": false, 00:19:48.834 "seek_data": false, 
00:19:48.834 "copy": true, 00:19:48.834 "nvme_iov_md": false 00:19:48.834 }, 00:19:48.834 "memory_domains": [ 00:19:48.834 { 00:19:48.834 "dma_device_id": "system", 00:19:48.834 "dma_device_type": 1 00:19:48.834 }, 00:19:48.834 { 00:19:48.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.834 "dma_device_type": 2 00:19:48.834 } 00:19:48.834 ], 00:19:48.834 "driver_specific": {} 00:19:48.834 } 00:19:48.834 ] 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.834 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:49.094 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:49.094 "name": "Existed_Raid", 00:19:49.094 "uuid": "3aa39c33-e269-456c-b12e-b005a09d6dbb", 00:19:49.094 "strip_size_kb": 64, 00:19:49.094 "state": "configuring", 00:19:49.094 "raid_level": "raid0", 00:19:49.094 "superblock": true, 00:19:49.094 "num_base_bdevs": 4, 00:19:49.094 "num_base_bdevs_discovered": 1, 00:19:49.094 "num_base_bdevs_operational": 4, 00:19:49.094 "base_bdevs_list": [ 00:19:49.094 { 00:19:49.094 "name": "BaseBdev1", 00:19:49.094 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:49.094 "is_configured": true, 00:19:49.094 "data_offset": 2048, 00:19:49.094 "data_size": 63488 00:19:49.094 }, 00:19:49.094 { 00:19:49.094 "name": "BaseBdev2", 00:19:49.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.094 "is_configured": false, 00:19:49.094 "data_offset": 0, 00:19:49.094 "data_size": 0 00:19:49.094 }, 00:19:49.094 { 00:19:49.094 "name": "BaseBdev3", 00:19:49.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.094 "is_configured": false, 00:19:49.094 "data_offset": 0, 00:19:49.094 "data_size": 0 00:19:49.094 }, 00:19:49.094 { 00:19:49.094 "name": "BaseBdev4", 00:19:49.094 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:49.094 "is_configured": false, 00:19:49.094 "data_offset": 0, 00:19:49.094 "data_size": 0 00:19:49.094 } 00:19:49.094 ] 00:19:49.094 }' 00:19:49.094 12:01:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:49.094 12:01:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.664 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:49.922 [2024-07-15 12:01:03.270785] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:19:49.922 [2024-07-15 12:01:03.270823] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x242c390 name Existed_Raid, state configuring 00:19:49.922 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:50.181 [2024-07-15 12:01:03.531521] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:50.181 [2024-07-15 12:01:03.532966] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:50.181 [2024-07-15 12:01:03.533001] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:50.181 [2024-07-15 12:01:03.533011] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:50.181 [2024-07-15 12:01:03.533022] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:50.181 [2024-07-15 12:01:03.533032] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:50.181 [2024-07-15 12:01:03.533043] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.181 12:01:03 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.181 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.439 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.439 "name": "Existed_Raid", 00:19:50.439 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:50.439 "strip_size_kb": 64, 00:19:50.439 "state": "configuring", 00:19:50.439 "raid_level": "raid0", 00:19:50.439 "superblock": true, 00:19:50.439 "num_base_bdevs": 4, 00:19:50.439 "num_base_bdevs_discovered": 1, 00:19:50.439 "num_base_bdevs_operational": 4, 00:19:50.439 "base_bdevs_list": [ 00:19:50.439 { 00:19:50.439 "name": "BaseBdev1", 00:19:50.439 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:50.439 "is_configured": true, 00:19:50.439 "data_offset": 2048, 00:19:50.439 "data_size": 63488 00:19:50.439 }, 00:19:50.439 { 00:19:50.439 "name": "BaseBdev2", 00:19:50.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.439 "is_configured": false, 
00:19:50.439 "data_offset": 0, 00:19:50.439 "data_size": 0 00:19:50.439 }, 00:19:50.439 { 00:19:50.439 "name": "BaseBdev3", 00:19:50.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.439 "is_configured": false, 00:19:50.439 "data_offset": 0, 00:19:50.439 "data_size": 0 00:19:50.439 }, 00:19:50.439 { 00:19:50.439 "name": "BaseBdev4", 00:19:50.439 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:50.439 "is_configured": false, 00:19:50.439 "data_offset": 0, 00:19:50.439 "data_size": 0 00:19:50.439 } 00:19:50.439 ] 00:19:50.439 }' 00:19:50.439 12:01:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.439 12:01:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:51.008 12:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:51.267 [2024-07-15 12:01:04.650171] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:51.267 BaseBdev2 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:51.267 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:19:51.530 12:01:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:51.530 [ 00:19:51.530 { 00:19:51.530 "name": "BaseBdev2", 00:19:51.530 "aliases": [ 00:19:51.530 "1c510d82-5404-4d06-a0c2-9fc679bc931d" 00:19:51.530 ], 00:19:51.530 "product_name": "Malloc disk", 00:19:51.530 "block_size": 512, 00:19:51.530 "num_blocks": 65536, 00:19:51.530 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:51.530 "assigned_rate_limits": { 00:19:51.530 "rw_ios_per_sec": 0, 00:19:51.530 "rw_mbytes_per_sec": 0, 00:19:51.530 "r_mbytes_per_sec": 0, 00:19:51.530 "w_mbytes_per_sec": 0 00:19:51.530 }, 00:19:51.530 "claimed": true, 00:19:51.530 "claim_type": "exclusive_write", 00:19:51.530 "zoned": false, 00:19:51.530 "supported_io_types": { 00:19:51.530 "read": true, 00:19:51.530 "write": true, 00:19:51.530 "unmap": true, 00:19:51.530 "flush": true, 00:19:51.530 "reset": true, 00:19:51.530 "nvme_admin": false, 00:19:51.530 "nvme_io": false, 00:19:51.530 "nvme_io_md": false, 00:19:51.530 "write_zeroes": true, 00:19:51.530 "zcopy": true, 00:19:51.530 "get_zone_info": false, 00:19:51.530 "zone_management": false, 00:19:51.530 "zone_append": false, 00:19:51.530 "compare": false, 00:19:51.530 "compare_and_write": false, 00:19:51.530 "abort": true, 00:19:51.530 "seek_hole": false, 00:19:51.530 "seek_data": false, 00:19:51.530 "copy": true, 00:19:51.530 "nvme_iov_md": false 00:19:51.530 }, 00:19:51.530 "memory_domains": [ 00:19:51.530 { 00:19:51.530 "dma_device_id": "system", 00:19:51.530 "dma_device_type": 1 00:19:51.530 }, 00:19:51.530 { 00:19:51.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:51.530 "dma_device_type": 2 00:19:51.530 } 00:19:51.530 ], 00:19:51.530 "driver_specific": {} 00:19:51.530 } 00:19:51.530 ] 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.530 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.531 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.531 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.531 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.531 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.867 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.867 "name": "Existed_Raid", 00:19:51.867 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:51.867 "strip_size_kb": 64, 
00:19:51.867 "state": "configuring", 00:19:51.867 "raid_level": "raid0", 00:19:51.867 "superblock": true, 00:19:51.867 "num_base_bdevs": 4, 00:19:51.867 "num_base_bdevs_discovered": 2, 00:19:51.867 "num_base_bdevs_operational": 4, 00:19:51.867 "base_bdevs_list": [ 00:19:51.867 { 00:19:51.867 "name": "BaseBdev1", 00:19:51.867 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:51.867 "is_configured": true, 00:19:51.867 "data_offset": 2048, 00:19:51.867 "data_size": 63488 00:19:51.867 }, 00:19:51.867 { 00:19:51.867 "name": "BaseBdev2", 00:19:51.867 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:51.867 "is_configured": true, 00:19:51.867 "data_offset": 2048, 00:19:51.867 "data_size": 63488 00:19:51.867 }, 00:19:51.867 { 00:19:51.867 "name": "BaseBdev3", 00:19:51.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.867 "is_configured": false, 00:19:51.867 "data_offset": 0, 00:19:51.867 "data_size": 0 00:19:51.867 }, 00:19:51.867 { 00:19:51.867 "name": "BaseBdev4", 00:19:51.867 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:51.867 "is_configured": false, 00:19:51.867 "data_offset": 0, 00:19:51.867 "data_size": 0 00:19:51.867 } 00:19:51.867 ] 00:19:51.867 }' 00:19:51.867 12:01:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.867 12:01:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:52.448 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:52.707 [2024-07-15 12:01:06.263174] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:52.707 BaseBdev3 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:52.707 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:52.966 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:53.224 [ 00:19:53.224 { 00:19:53.224 "name": "BaseBdev3", 00:19:53.224 "aliases": [ 00:19:53.224 "1523c97d-7363-4728-b539-d29eddb20acb" 00:19:53.224 ], 00:19:53.224 "product_name": "Malloc disk", 00:19:53.224 "block_size": 512, 00:19:53.224 "num_blocks": 65536, 00:19:53.224 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:53.224 "assigned_rate_limits": { 00:19:53.224 "rw_ios_per_sec": 0, 00:19:53.224 "rw_mbytes_per_sec": 0, 00:19:53.224 "r_mbytes_per_sec": 0, 00:19:53.224 "w_mbytes_per_sec": 0 00:19:53.224 }, 00:19:53.224 "claimed": true, 00:19:53.224 "claim_type": "exclusive_write", 00:19:53.224 "zoned": false, 00:19:53.224 "supported_io_types": { 00:19:53.224 "read": true, 00:19:53.224 "write": true, 00:19:53.224 "unmap": true, 00:19:53.224 "flush": true, 00:19:53.224 "reset": true, 00:19:53.224 "nvme_admin": false, 00:19:53.224 "nvme_io": false, 00:19:53.224 "nvme_io_md": false, 00:19:53.224 "write_zeroes": true, 00:19:53.224 "zcopy": true, 00:19:53.224 "get_zone_info": false, 00:19:53.224 "zone_management": false, 00:19:53.224 "zone_append": false, 00:19:53.224 
"compare": false, 00:19:53.224 "compare_and_write": false, 00:19:53.224 "abort": true, 00:19:53.224 "seek_hole": false, 00:19:53.224 "seek_data": false, 00:19:53.224 "copy": true, 00:19:53.224 "nvme_iov_md": false 00:19:53.224 }, 00:19:53.224 "memory_domains": [ 00:19:53.224 { 00:19:53.224 "dma_device_id": "system", 00:19:53.224 "dma_device_type": 1 00:19:53.224 }, 00:19:53.224 { 00:19:53.224 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:53.224 "dma_device_type": 2 00:19:53.224 } 00:19:53.224 ], 00:19:53.224 "driver_specific": {} 00:19:53.224 } 00:19:53.224 ] 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.224 12:01:06 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.224 12:01:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:53.483 12:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.483 "name": "Existed_Raid", 00:19:53.483 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:53.483 "strip_size_kb": 64, 00:19:53.483 "state": "configuring", 00:19:53.483 "raid_level": "raid0", 00:19:53.483 "superblock": true, 00:19:53.483 "num_base_bdevs": 4, 00:19:53.483 "num_base_bdevs_discovered": 3, 00:19:53.483 "num_base_bdevs_operational": 4, 00:19:53.483 "base_bdevs_list": [ 00:19:53.483 { 00:19:53.483 "name": "BaseBdev1", 00:19:53.483 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:53.483 "is_configured": true, 00:19:53.483 "data_offset": 2048, 00:19:53.483 "data_size": 63488 00:19:53.483 }, 00:19:53.483 { 00:19:53.483 "name": "BaseBdev2", 00:19:53.483 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:53.483 "is_configured": true, 00:19:53.483 "data_offset": 2048, 00:19:53.483 "data_size": 63488 00:19:53.483 }, 00:19:53.483 { 00:19:53.483 "name": "BaseBdev3", 00:19:53.483 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:53.483 "is_configured": true, 00:19:53.483 "data_offset": 2048, 00:19:53.483 "data_size": 63488 00:19:53.483 }, 00:19:53.483 { 00:19:53.483 "name": "BaseBdev4", 00:19:53.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:53.483 "is_configured": false, 00:19:53.483 "data_offset": 0, 00:19:53.483 "data_size": 0 00:19:53.483 } 00:19:53.483 ] 00:19:53.483 }' 00:19:53.483 12:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.483 12:01:07 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:54.419 [2024-07-15 12:01:07.939038] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:54.419 [2024-07-15 12:01:07.939215] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x242d4a0 00:19:54.419 [2024-07-15 12:01:07.939230] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:54.419 [2024-07-15 12:01:07.939412] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242d0a0 00:19:54.419 [2024-07-15 12:01:07.939536] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x242d4a0 00:19:54.419 [2024-07-15 12:01:07.939546] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x242d4a0 00:19:54.419 [2024-07-15 12:01:07.939637] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:54.419 BaseBdev4 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:19:54.419 12:01:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:54.678 12:01:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:54.936 [ 00:19:54.936 { 00:19:54.936 "name": "BaseBdev4", 00:19:54.936 "aliases": [ 00:19:54.936 "e1367128-c88d-4f8e-8ba1-78bb2ff935db" 00:19:54.936 ], 00:19:54.936 "product_name": "Malloc disk", 00:19:54.936 "block_size": 512, 00:19:54.936 "num_blocks": 65536, 00:19:54.936 "uuid": "e1367128-c88d-4f8e-8ba1-78bb2ff935db", 00:19:54.936 "assigned_rate_limits": { 00:19:54.936 "rw_ios_per_sec": 0, 00:19:54.936 "rw_mbytes_per_sec": 0, 00:19:54.936 "r_mbytes_per_sec": 0, 00:19:54.936 "w_mbytes_per_sec": 0 00:19:54.936 }, 00:19:54.936 "claimed": true, 00:19:54.936 "claim_type": "exclusive_write", 00:19:54.936 "zoned": false, 00:19:54.936 "supported_io_types": { 00:19:54.936 "read": true, 00:19:54.936 "write": true, 00:19:54.936 "unmap": true, 00:19:54.936 "flush": true, 00:19:54.936 "reset": true, 00:19:54.936 "nvme_admin": false, 00:19:54.936 "nvme_io": false, 00:19:54.936 "nvme_io_md": false, 00:19:54.936 "write_zeroes": true, 00:19:54.936 "zcopy": true, 00:19:54.936 "get_zone_info": false, 00:19:54.936 "zone_management": false, 00:19:54.936 "zone_append": false, 00:19:54.936 "compare": false, 00:19:54.936 "compare_and_write": false, 00:19:54.936 "abort": true, 00:19:54.936 "seek_hole": false, 00:19:54.936 "seek_data": false, 00:19:54.936 "copy": true, 00:19:54.936 "nvme_iov_md": false 00:19:54.936 }, 00:19:54.936 "memory_domains": [ 00:19:54.936 { 00:19:54.936 "dma_device_id": "system", 00:19:54.936 "dma_device_type": 1 00:19:54.936 }, 00:19:54.936 { 00:19:54.936 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:54.936 "dma_device_type": 2 00:19:54.936 } 00:19:54.936 ], 00:19:54.936 "driver_specific": {} 00:19:54.936 } 00:19:54.936 ] 
00:19:54.936 12:01:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:19:54.936 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:54.936 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.937 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:55.196 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:55.196 "name": "Existed_Raid", 00:19:55.196 
"uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:55.196 "strip_size_kb": 64, 00:19:55.196 "state": "online", 00:19:55.196 "raid_level": "raid0", 00:19:55.196 "superblock": true, 00:19:55.196 "num_base_bdevs": 4, 00:19:55.196 "num_base_bdevs_discovered": 4, 00:19:55.196 "num_base_bdevs_operational": 4, 00:19:55.196 "base_bdevs_list": [ 00:19:55.196 { 00:19:55.196 "name": "BaseBdev1", 00:19:55.196 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:55.196 "is_configured": true, 00:19:55.196 "data_offset": 2048, 00:19:55.196 "data_size": 63488 00:19:55.196 }, 00:19:55.196 { 00:19:55.196 "name": "BaseBdev2", 00:19:55.196 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:55.196 "is_configured": true, 00:19:55.196 "data_offset": 2048, 00:19:55.196 "data_size": 63488 00:19:55.196 }, 00:19:55.196 { 00:19:55.196 "name": "BaseBdev3", 00:19:55.196 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:55.196 "is_configured": true, 00:19:55.196 "data_offset": 2048, 00:19:55.196 "data_size": 63488 00:19:55.196 }, 00:19:55.196 { 00:19:55.196 "name": "BaseBdev4", 00:19:55.196 "uuid": "e1367128-c88d-4f8e-8ba1-78bb2ff935db", 00:19:55.196 "is_configured": true, 00:19:55.196 "data_offset": 2048, 00:19:55.196 "data_size": 63488 00:19:55.196 } 00:19:55.196 ] 00:19:55.196 }' 00:19:55.196 12:01:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:55.196 12:01:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:56.133 12:01:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:56.133 12:01:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:56.702 [2024-07-15 12:01:10.113238] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:56.702 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:56.702 "name": "Existed_Raid", 00:19:56.702 "aliases": [ 00:19:56.702 "68809798-b6af-492c-be5b-5ecdcd02d37b" 00:19:56.702 ], 00:19:56.702 "product_name": "Raid Volume", 00:19:56.702 "block_size": 512, 00:19:56.702 "num_blocks": 253952, 00:19:56.702 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:56.702 "assigned_rate_limits": { 00:19:56.702 "rw_ios_per_sec": 0, 00:19:56.702 "rw_mbytes_per_sec": 0, 00:19:56.702 "r_mbytes_per_sec": 0, 00:19:56.702 "w_mbytes_per_sec": 0 00:19:56.702 }, 00:19:56.702 "claimed": false, 00:19:56.702 "zoned": false, 00:19:56.702 "supported_io_types": { 00:19:56.702 "read": true, 00:19:56.702 "write": true, 00:19:56.702 "unmap": true, 00:19:56.702 "flush": true, 00:19:56.702 "reset": true, 00:19:56.702 "nvme_admin": false, 00:19:56.702 "nvme_io": false, 00:19:56.702 "nvme_io_md": false, 00:19:56.702 "write_zeroes": true, 00:19:56.702 "zcopy": false, 00:19:56.702 "get_zone_info": false, 00:19:56.702 "zone_management": false, 00:19:56.702 "zone_append": false, 00:19:56.702 "compare": false, 00:19:56.702 "compare_and_write": false, 00:19:56.702 "abort": false, 00:19:56.702 "seek_hole": false, 00:19:56.702 "seek_data": false, 00:19:56.702 "copy": false, 00:19:56.702 "nvme_iov_md": false 00:19:56.702 }, 00:19:56.702 
"memory_domains": [ 00:19:56.702 { 00:19:56.702 "dma_device_id": "system", 00:19:56.702 "dma_device_type": 1 00:19:56.702 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.703 "dma_device_type": 2 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "system", 00:19:56.703 "dma_device_type": 1 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.703 "dma_device_type": 2 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "system", 00:19:56.703 "dma_device_type": 1 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.703 "dma_device_type": 2 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "system", 00:19:56.703 "dma_device_type": 1 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.703 "dma_device_type": 2 00:19:56.703 } 00:19:56.703 ], 00:19:56.703 "driver_specific": { 00:19:56.703 "raid": { 00:19:56.703 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:56.703 "strip_size_kb": 64, 00:19:56.703 "state": "online", 00:19:56.703 "raid_level": "raid0", 00:19:56.703 "superblock": true, 00:19:56.703 "num_base_bdevs": 4, 00:19:56.703 "num_base_bdevs_discovered": 4, 00:19:56.703 "num_base_bdevs_operational": 4, 00:19:56.703 "base_bdevs_list": [ 00:19:56.703 { 00:19:56.703 "name": "BaseBdev1", 00:19:56.703 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:56.703 "is_configured": true, 00:19:56.703 "data_offset": 2048, 00:19:56.703 "data_size": 63488 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "name": "BaseBdev2", 00:19:56.703 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:56.703 "is_configured": true, 00:19:56.703 "data_offset": 2048, 00:19:56.703 "data_size": 63488 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "name": "BaseBdev3", 00:19:56.703 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:56.703 "is_configured": true, 00:19:56.703 "data_offset": 2048, 00:19:56.703 
"data_size": 63488 00:19:56.703 }, 00:19:56.703 { 00:19:56.703 "name": "BaseBdev4", 00:19:56.703 "uuid": "e1367128-c88d-4f8e-8ba1-78bb2ff935db", 00:19:56.703 "is_configured": true, 00:19:56.703 "data_offset": 2048, 00:19:56.703 "data_size": 63488 00:19:56.703 } 00:19:56.703 ] 00:19:56.703 } 00:19:56.703 } 00:19:56.703 }' 00:19:56.703 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:56.703 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:56.703 BaseBdev2 00:19:56.703 BaseBdev3 00:19:56.703 BaseBdev4' 00:19:56.703 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:56.703 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:56.703 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:56.962 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:56.962 "name": "BaseBdev1", 00:19:56.962 "aliases": [ 00:19:56.962 "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b" 00:19:56.962 ], 00:19:56.962 "product_name": "Malloc disk", 00:19:56.962 "block_size": 512, 00:19:56.962 "num_blocks": 65536, 00:19:56.962 "uuid": "3a2ee770-099f-48d6-b9a7-03ad2a63ae2b", 00:19:56.962 "assigned_rate_limits": { 00:19:56.962 "rw_ios_per_sec": 0, 00:19:56.962 "rw_mbytes_per_sec": 0, 00:19:56.962 "r_mbytes_per_sec": 0, 00:19:56.962 "w_mbytes_per_sec": 0 00:19:56.962 }, 00:19:56.962 "claimed": true, 00:19:56.962 "claim_type": "exclusive_write", 00:19:56.962 "zoned": false, 00:19:56.962 "supported_io_types": { 00:19:56.962 "read": true, 00:19:56.962 "write": true, 00:19:56.962 "unmap": true, 00:19:56.962 "flush": true, 00:19:56.962 "reset": true, 
00:19:56.962 "nvme_admin": false, 00:19:56.962 "nvme_io": false, 00:19:56.962 "nvme_io_md": false, 00:19:56.962 "write_zeroes": true, 00:19:56.962 "zcopy": true, 00:19:56.962 "get_zone_info": false, 00:19:56.962 "zone_management": false, 00:19:56.962 "zone_append": false, 00:19:56.962 "compare": false, 00:19:56.962 "compare_and_write": false, 00:19:56.962 "abort": true, 00:19:56.962 "seek_hole": false, 00:19:56.962 "seek_data": false, 00:19:56.962 "copy": true, 00:19:56.962 "nvme_iov_md": false 00:19:56.962 }, 00:19:56.962 "memory_domains": [ 00:19:56.962 { 00:19:56.962 "dma_device_id": "system", 00:19:56.962 "dma_device_type": 1 00:19:56.962 }, 00:19:56.962 { 00:19:56.962 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.962 "dma_device_type": 2 00:19:56.962 } 00:19:56.962 ], 00:19:56.962 "driver_specific": {} 00:19:56.962 }' 00:19:56.962 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.962 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:56.962 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:56.962 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:57.221 12:01:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.481 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.481 "name": "BaseBdev2", 00:19:57.481 "aliases": [ 00:19:57.481 "1c510d82-5404-4d06-a0c2-9fc679bc931d" 00:19:57.481 ], 00:19:57.481 "product_name": "Malloc disk", 00:19:57.481 "block_size": 512, 00:19:57.481 "num_blocks": 65536, 00:19:57.481 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:57.481 "assigned_rate_limits": { 00:19:57.481 "rw_ios_per_sec": 0, 00:19:57.481 "rw_mbytes_per_sec": 0, 00:19:57.481 "r_mbytes_per_sec": 0, 00:19:57.481 "w_mbytes_per_sec": 0 00:19:57.481 }, 00:19:57.481 "claimed": true, 00:19:57.481 "claim_type": "exclusive_write", 00:19:57.481 "zoned": false, 00:19:57.481 "supported_io_types": { 00:19:57.481 "read": true, 00:19:57.481 "write": true, 00:19:57.481 "unmap": true, 00:19:57.481 "flush": true, 00:19:57.481 "reset": true, 00:19:57.481 "nvme_admin": false, 00:19:57.481 "nvme_io": false, 00:19:57.481 "nvme_io_md": false, 00:19:57.481 "write_zeroes": true, 00:19:57.481 "zcopy": true, 00:19:57.481 "get_zone_info": false, 00:19:57.481 "zone_management": false, 00:19:57.481 "zone_append": false, 00:19:57.481 "compare": false, 00:19:57.481 "compare_and_write": false, 00:19:57.481 "abort": true, 00:19:57.481 "seek_hole": false, 00:19:57.481 "seek_data": false, 00:19:57.481 "copy": true, 00:19:57.481 "nvme_iov_md": false 00:19:57.481 }, 00:19:57.481 "memory_domains": [ 00:19:57.481 { 
00:19:57.481 "dma_device_id": "system", 00:19:57.481 "dma_device_type": 1 00:19:57.481 }, 00:19:57.481 { 00:19:57.481 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.481 "dma_device_type": 2 00:19:57.481 } 00:19:57.481 ], 00:19:57.481 "driver_specific": {} 00:19:57.481 }' 00:19:57.481 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.740 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.000 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:58.000 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:58.000 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:58.000 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:58.259 12:01:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:58.259 "name": "BaseBdev3", 00:19:58.259 "aliases": [ 00:19:58.259 "1523c97d-7363-4728-b539-d29eddb20acb" 00:19:58.259 ], 00:19:58.259 "product_name": "Malloc disk", 00:19:58.259 "block_size": 512, 00:19:58.259 "num_blocks": 65536, 00:19:58.259 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:58.259 "assigned_rate_limits": { 00:19:58.259 "rw_ios_per_sec": 0, 00:19:58.259 "rw_mbytes_per_sec": 0, 00:19:58.259 "r_mbytes_per_sec": 0, 00:19:58.259 "w_mbytes_per_sec": 0 00:19:58.259 }, 00:19:58.259 "claimed": true, 00:19:58.259 "claim_type": "exclusive_write", 00:19:58.259 "zoned": false, 00:19:58.259 "supported_io_types": { 00:19:58.259 "read": true, 00:19:58.259 "write": true, 00:19:58.259 "unmap": true, 00:19:58.259 "flush": true, 00:19:58.259 "reset": true, 00:19:58.259 "nvme_admin": false, 00:19:58.259 "nvme_io": false, 00:19:58.259 "nvme_io_md": false, 00:19:58.259 "write_zeroes": true, 00:19:58.259 "zcopy": true, 00:19:58.259 "get_zone_info": false, 00:19:58.259 "zone_management": false, 00:19:58.259 "zone_append": false, 00:19:58.259 "compare": false, 00:19:58.259 "compare_and_write": false, 00:19:58.259 "abort": true, 00:19:58.259 "seek_hole": false, 00:19:58.259 "seek_data": false, 00:19:58.259 "copy": true, 00:19:58.259 "nvme_iov_md": false 00:19:58.259 }, 00:19:58.259 "memory_domains": [ 00:19:58.259 { 00:19:58.259 "dma_device_id": "system", 00:19:58.259 "dma_device_type": 1 00:19:58.259 }, 00:19:58.259 { 00:19:58.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.259 "dma_device_type": 2 00:19:58.259 } 00:19:58.259 ], 00:19:58.259 "driver_specific": {} 00:19:58.259 }' 00:19:58.259 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:58.259 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:58.259 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:19:58.259 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.259 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.518 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:58.518 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.518 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.518 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:58.518 12:01:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.518 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.518 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:58.518 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:58.518 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:58.518 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:58.778 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:58.778 "name": "BaseBdev4", 00:19:58.778 "aliases": [ 00:19:58.778 "e1367128-c88d-4f8e-8ba1-78bb2ff935db" 00:19:58.778 ], 00:19:58.778 "product_name": "Malloc disk", 00:19:58.778 "block_size": 512, 00:19:58.778 "num_blocks": 65536, 00:19:58.778 "uuid": "e1367128-c88d-4f8e-8ba1-78bb2ff935db", 00:19:58.778 "assigned_rate_limits": { 00:19:58.778 "rw_ios_per_sec": 0, 00:19:58.778 "rw_mbytes_per_sec": 0, 00:19:58.778 "r_mbytes_per_sec": 0, 00:19:58.778 "w_mbytes_per_sec": 0 
00:19:58.778 }, 00:19:58.778 "claimed": true, 00:19:58.778 "claim_type": "exclusive_write", 00:19:58.778 "zoned": false, 00:19:58.778 "supported_io_types": { 00:19:58.778 "read": true, 00:19:58.778 "write": true, 00:19:58.778 "unmap": true, 00:19:58.778 "flush": true, 00:19:58.778 "reset": true, 00:19:58.778 "nvme_admin": false, 00:19:58.778 "nvme_io": false, 00:19:58.778 "nvme_io_md": false, 00:19:58.778 "write_zeroes": true, 00:19:58.778 "zcopy": true, 00:19:58.778 "get_zone_info": false, 00:19:58.778 "zone_management": false, 00:19:58.778 "zone_append": false, 00:19:58.778 "compare": false, 00:19:58.778 "compare_and_write": false, 00:19:58.778 "abort": true, 00:19:58.778 "seek_hole": false, 00:19:58.778 "seek_data": false, 00:19:58.778 "copy": true, 00:19:58.778 "nvme_iov_md": false 00:19:58.778 }, 00:19:58.778 "memory_domains": [ 00:19:58.778 { 00:19:58.778 "dma_device_id": "system", 00:19:58.778 "dma_device_type": 1 00:19:58.778 }, 00:19:58.778 { 00:19:58.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.778 "dma_device_type": 2 00:19:58.778 } 00:19:58.778 ], 00:19:58.778 "driver_specific": {} 00:19:58.778 }' 00:19:58.778 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:59.037 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:59.296 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:59.296 
12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:59.296 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:59.296 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:59.296 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:59.296 12:01:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:59.555 [2024-07-15 12:01:13.044707] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:59.555 [2024-07-15 12:01:13.044738] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:59.555 [2024-07-15 12:01:13.044786] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.555 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:59.815 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:59.815 "name": "Existed_Raid", 00:19:59.815 "uuid": "68809798-b6af-492c-be5b-5ecdcd02d37b", 00:19:59.815 "strip_size_kb": 64, 00:19:59.815 "state": "offline", 00:19:59.815 "raid_level": "raid0", 00:19:59.815 "superblock": true, 00:19:59.815 "num_base_bdevs": 4, 00:19:59.815 "num_base_bdevs_discovered": 3, 00:19:59.815 "num_base_bdevs_operational": 3, 00:19:59.815 "base_bdevs_list": [ 00:19:59.815 { 00:19:59.815 "name": null, 00:19:59.815 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:59.815 "is_configured": false, 00:19:59.815 "data_offset": 2048, 00:19:59.815 "data_size": 63488 00:19:59.815 }, 00:19:59.815 { 00:19:59.815 "name": "BaseBdev2", 00:19:59.815 "uuid": "1c510d82-5404-4d06-a0c2-9fc679bc931d", 00:19:59.815 "is_configured": true, 00:19:59.815 "data_offset": 2048, 00:19:59.815 "data_size": 63488 00:19:59.815 }, 00:19:59.815 
{ 00:19:59.815 "name": "BaseBdev3", 00:19:59.815 "uuid": "1523c97d-7363-4728-b539-d29eddb20acb", 00:19:59.815 "is_configured": true, 00:19:59.815 "data_offset": 2048, 00:19:59.815 "data_size": 63488 00:19:59.815 }, 00:19:59.815 { 00:19:59.815 "name": "BaseBdev4", 00:19:59.815 "uuid": "e1367128-c88d-4f8e-8ba1-78bb2ff935db", 00:19:59.815 "is_configured": true, 00:19:59.815 "data_offset": 2048, 00:19:59.815 "data_size": 63488 00:19:59.815 } 00:19:59.815 ] 00:19:59.815 }' 00:19:59.815 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:59.815 12:01:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:00.383 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:00.383 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:00.383 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.383 12:01:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:00.641 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:00.641 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:00.642 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:00.642 [2024-07-15 12:01:14.236996] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:00.900 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:00.900 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:00.900 
12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.900 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:01.159 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:01.160 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:01.160 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:01.160 [2024-07-15 12:01:14.749074] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:01.418 12:01:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:01.676 [2024-07-15 12:01:15.126399] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:01.676 [2024-07-15 12:01:15.126438] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x242d4a0 name Existed_Raid, state offline 00:20:01.676 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:01.676 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:01.676 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.676 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:01.935 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:02.194 BaseBdev2 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.194 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:02.453 12:01:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:02.712 [ 00:20:02.712 { 00:20:02.712 "name": "BaseBdev2", 00:20:02.712 "aliases": [ 00:20:02.712 "ec45357c-98f0-4618-bf8d-e91369776704" 00:20:02.712 ], 00:20:02.712 "product_name": "Malloc disk", 00:20:02.712 "block_size": 512, 00:20:02.712 "num_blocks": 65536, 00:20:02.712 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:02.712 "assigned_rate_limits": { 00:20:02.712 "rw_ios_per_sec": 0, 00:20:02.712 "rw_mbytes_per_sec": 0, 00:20:02.712 "r_mbytes_per_sec": 0, 00:20:02.712 "w_mbytes_per_sec": 0 00:20:02.712 }, 00:20:02.712 "claimed": false, 00:20:02.712 "zoned": false, 00:20:02.712 "supported_io_types": { 00:20:02.712 "read": true, 00:20:02.712 "write": true, 00:20:02.712 "unmap": true, 00:20:02.712 "flush": true, 00:20:02.712 "reset": true, 00:20:02.712 "nvme_admin": false, 00:20:02.712 "nvme_io": false, 00:20:02.712 "nvme_io_md": false, 00:20:02.712 "write_zeroes": true, 00:20:02.712 "zcopy": true, 00:20:02.712 "get_zone_info": false, 00:20:02.712 "zone_management": false, 00:20:02.712 "zone_append": false, 00:20:02.712 "compare": false, 00:20:02.712 "compare_and_write": false, 00:20:02.712 "abort": true, 00:20:02.712 "seek_hole": false, 00:20:02.712 "seek_data": false, 00:20:02.712 "copy": true, 00:20:02.712 "nvme_iov_md": false 00:20:02.712 }, 00:20:02.712 "memory_domains": [ 00:20:02.712 { 00:20:02.712 "dma_device_id": "system", 00:20:02.712 "dma_device_type": 1 00:20:02.712 }, 00:20:02.712 { 00:20:02.712 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:02.712 "dma_device_type": 2 00:20:02.712 } 00:20:02.712 ], 00:20:02.712 "driver_specific": {} 00:20:02.712 } 00:20:02.712 ] 00:20:02.712 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:02.712 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:02.712 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:02.712 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:02.971 BaseBdev3 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:02.971 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:03.230 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:03.489 [ 00:20:03.489 { 00:20:03.489 "name": "BaseBdev3", 00:20:03.489 "aliases": [ 00:20:03.489 "931c2558-0bef-4c46-afab-f70c13c9d4fc" 
00:20:03.489 ], 00:20:03.489 "product_name": "Malloc disk", 00:20:03.489 "block_size": 512, 00:20:03.489 "num_blocks": 65536, 00:20:03.489 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:03.489 "assigned_rate_limits": { 00:20:03.489 "rw_ios_per_sec": 0, 00:20:03.489 "rw_mbytes_per_sec": 0, 00:20:03.489 "r_mbytes_per_sec": 0, 00:20:03.489 "w_mbytes_per_sec": 0 00:20:03.489 }, 00:20:03.489 "claimed": false, 00:20:03.489 "zoned": false, 00:20:03.489 "supported_io_types": { 00:20:03.489 "read": true, 00:20:03.489 "write": true, 00:20:03.489 "unmap": true, 00:20:03.489 "flush": true, 00:20:03.489 "reset": true, 00:20:03.489 "nvme_admin": false, 00:20:03.489 "nvme_io": false, 00:20:03.489 "nvme_io_md": false, 00:20:03.489 "write_zeroes": true, 00:20:03.489 "zcopy": true, 00:20:03.489 "get_zone_info": false, 00:20:03.489 "zone_management": false, 00:20:03.489 "zone_append": false, 00:20:03.489 "compare": false, 00:20:03.489 "compare_and_write": false, 00:20:03.489 "abort": true, 00:20:03.489 "seek_hole": false, 00:20:03.489 "seek_data": false, 00:20:03.489 "copy": true, 00:20:03.489 "nvme_iov_md": false 00:20:03.489 }, 00:20:03.489 "memory_domains": [ 00:20:03.489 { 00:20:03.489 "dma_device_id": "system", 00:20:03.489 "dma_device_type": 1 00:20:03.489 }, 00:20:03.489 { 00:20:03.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.489 "dma_device_type": 2 00:20:03.489 } 00:20:03.489 ], 00:20:03.489 "driver_specific": {} 00:20:03.489 } 00:20:03.489 ] 00:20:03.489 12:01:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:03.489 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:03.489 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:03.489 12:01:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:20:03.749 BaseBdev4 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:03.749 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:04.008 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:04.268 [ 00:20:04.268 { 00:20:04.268 "name": "BaseBdev4", 00:20:04.268 "aliases": [ 00:20:04.268 "b21f87e3-94cc-48e6-8dce-eabc4a3723c8" 00:20:04.268 ], 00:20:04.268 "product_name": "Malloc disk", 00:20:04.268 "block_size": 512, 00:20:04.268 "num_blocks": 65536, 00:20:04.268 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:04.268 "assigned_rate_limits": { 00:20:04.268 "rw_ios_per_sec": 0, 00:20:04.268 "rw_mbytes_per_sec": 0, 00:20:04.268 "r_mbytes_per_sec": 0, 00:20:04.268 "w_mbytes_per_sec": 0 00:20:04.268 }, 00:20:04.268 "claimed": false, 00:20:04.268 "zoned": false, 00:20:04.268 "supported_io_types": { 00:20:04.268 "read": true, 00:20:04.268 "write": true, 00:20:04.268 "unmap": true, 00:20:04.268 "flush": true, 00:20:04.268 "reset": true, 00:20:04.268 "nvme_admin": false, 00:20:04.268 "nvme_io": false, 00:20:04.268 
"nvme_io_md": false, 00:20:04.268 "write_zeroes": true, 00:20:04.268 "zcopy": true, 00:20:04.268 "get_zone_info": false, 00:20:04.268 "zone_management": false, 00:20:04.268 "zone_append": false, 00:20:04.268 "compare": false, 00:20:04.268 "compare_and_write": false, 00:20:04.268 "abort": true, 00:20:04.268 "seek_hole": false, 00:20:04.268 "seek_data": false, 00:20:04.268 "copy": true, 00:20:04.268 "nvme_iov_md": false 00:20:04.268 }, 00:20:04.268 "memory_domains": [ 00:20:04.268 { 00:20:04.268 "dma_device_id": "system", 00:20:04.268 "dma_device_type": 1 00:20:04.268 }, 00:20:04.268 { 00:20:04.268 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.268 "dma_device_type": 2 00:20:04.268 } 00:20:04.268 ], 00:20:04.268 "driver_specific": {} 00:20:04.268 } 00:20:04.268 ] 00:20:04.268 12:01:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:04.268 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:04.268 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:04.268 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:04.268 [2024-07-15 12:01:17.854139] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:04.268 [2024-07-15 12:01:17.854184] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:04.268 [2024-07-15 12:01:17.854203] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:04.268 [2024-07-15 12:01:17.855569] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:04.268 [2024-07-15 12:01:17.855614] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.528 12:01:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.528 12:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.528 "name": "Existed_Raid", 00:20:04.528 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:04.528 "strip_size_kb": 64, 00:20:04.528 "state": "configuring", 00:20:04.528 "raid_level": "raid0", 00:20:04.528 "superblock": true, 00:20:04.528 "num_base_bdevs": 4, 00:20:04.528 "num_base_bdevs_discovered": 3, 00:20:04.528 
"num_base_bdevs_operational": 4, 00:20:04.528 "base_bdevs_list": [ 00:20:04.528 { 00:20:04.528 "name": "BaseBdev1", 00:20:04.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.528 "is_configured": false, 00:20:04.528 "data_offset": 0, 00:20:04.528 "data_size": 0 00:20:04.528 }, 00:20:04.528 { 00:20:04.528 "name": "BaseBdev2", 00:20:04.528 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:04.528 "is_configured": true, 00:20:04.528 "data_offset": 2048, 00:20:04.528 "data_size": 63488 00:20:04.528 }, 00:20:04.528 { 00:20:04.528 "name": "BaseBdev3", 00:20:04.528 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:04.528 "is_configured": true, 00:20:04.528 "data_offset": 2048, 00:20:04.528 "data_size": 63488 00:20:04.528 }, 00:20:04.528 { 00:20:04.528 "name": "BaseBdev4", 00:20:04.528 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:04.528 "is_configured": true, 00:20:04.528 "data_offset": 2048, 00:20:04.528 "data_size": 63488 00:20:04.528 } 00:20:04.528 ] 00:20:04.528 }' 00:20:04.528 12:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.528 12:01:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:05.464 12:01:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:05.464 [2024-07-15 12:01:19.009155] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:05.464 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:05.464 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.464 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.464 12:01:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.465 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:05.724 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.724 "name": "Existed_Raid", 00:20:05.724 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:05.724 "strip_size_kb": 64, 00:20:05.724 "state": "configuring", 00:20:05.724 "raid_level": "raid0", 00:20:05.724 "superblock": true, 00:20:05.724 "num_base_bdevs": 4, 00:20:05.724 "num_base_bdevs_discovered": 2, 00:20:05.724 "num_base_bdevs_operational": 4, 00:20:05.724 "base_bdevs_list": [ 00:20:05.724 { 00:20:05.724 "name": "BaseBdev1", 00:20:05.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.724 "is_configured": false, 00:20:05.724 "data_offset": 0, 00:20:05.724 "data_size": 0 00:20:05.724 }, 00:20:05.724 { 00:20:05.724 "name": null, 00:20:05.724 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:05.724 "is_configured": false, 00:20:05.724 "data_offset": 2048, 00:20:05.724 "data_size": 
63488 00:20:05.724 }, 00:20:05.724 { 00:20:05.724 "name": "BaseBdev3", 00:20:05.724 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:05.724 "is_configured": true, 00:20:05.724 "data_offset": 2048, 00:20:05.724 "data_size": 63488 00:20:05.724 }, 00:20:05.724 { 00:20:05.724 "name": "BaseBdev4", 00:20:05.724 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:05.724 "is_configured": true, 00:20:05.724 "data_offset": 2048, 00:20:05.724 "data_size": 63488 00:20:05.724 } 00:20:05.724 ] 00:20:05.724 }' 00:20:05.724 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.724 12:01:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:06.293 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.293 12:01:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:06.551 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:06.551 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:06.811 [2024-07-15 12:01:20.352095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:06.811 BaseBdev1 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 
00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:06.811 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:07.069 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:07.328 [ 00:20:07.328 { 00:20:07.328 "name": "BaseBdev1", 00:20:07.328 "aliases": [ 00:20:07.328 "70979a0a-e479-4e71-b398-751aa6458a43" 00:20:07.328 ], 00:20:07.328 "product_name": "Malloc disk", 00:20:07.328 "block_size": 512, 00:20:07.328 "num_blocks": 65536, 00:20:07.328 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:07.328 "assigned_rate_limits": { 00:20:07.328 "rw_ios_per_sec": 0, 00:20:07.328 "rw_mbytes_per_sec": 0, 00:20:07.328 "r_mbytes_per_sec": 0, 00:20:07.328 "w_mbytes_per_sec": 0 00:20:07.328 }, 00:20:07.328 "claimed": true, 00:20:07.328 "claim_type": "exclusive_write", 00:20:07.328 "zoned": false, 00:20:07.328 "supported_io_types": { 00:20:07.328 "read": true, 00:20:07.328 "write": true, 00:20:07.328 "unmap": true, 00:20:07.328 "flush": true, 00:20:07.328 "reset": true, 00:20:07.328 "nvme_admin": false, 00:20:07.328 "nvme_io": false, 00:20:07.328 "nvme_io_md": false, 00:20:07.328 "write_zeroes": true, 00:20:07.328 "zcopy": true, 00:20:07.328 "get_zone_info": false, 00:20:07.328 "zone_management": false, 00:20:07.328 "zone_append": false, 00:20:07.328 "compare": false, 00:20:07.328 "compare_and_write": false, 00:20:07.328 "abort": true, 00:20:07.328 "seek_hole": false, 00:20:07.328 "seek_data": false, 00:20:07.328 "copy": true, 00:20:07.328 "nvme_iov_md": false 00:20:07.328 }, 00:20:07.328 
"memory_domains": [ 00:20:07.328 { 00:20:07.328 "dma_device_id": "system", 00:20:07.328 "dma_device_type": 1 00:20:07.328 }, 00:20:07.328 { 00:20:07.328 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.328 "dma_device_type": 2 00:20:07.328 } 00:20:07.328 ], 00:20:07.328 "driver_specific": {} 00:20:07.328 } 00:20:07.328 ] 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.328 12:01:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.586 12:01:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.586 "name": "Existed_Raid", 00:20:07.586 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:07.586 "strip_size_kb": 64, 00:20:07.586 "state": "configuring", 00:20:07.586 "raid_level": "raid0", 00:20:07.586 "superblock": true, 00:20:07.586 "num_base_bdevs": 4, 00:20:07.586 "num_base_bdevs_discovered": 3, 00:20:07.586 "num_base_bdevs_operational": 4, 00:20:07.586 "base_bdevs_list": [ 00:20:07.586 { 00:20:07.586 "name": "BaseBdev1", 00:20:07.586 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:07.586 "is_configured": true, 00:20:07.586 "data_offset": 2048, 00:20:07.586 "data_size": 63488 00:20:07.586 }, 00:20:07.586 { 00:20:07.586 "name": null, 00:20:07.586 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:07.586 "is_configured": false, 00:20:07.586 "data_offset": 2048, 00:20:07.586 "data_size": 63488 00:20:07.586 }, 00:20:07.586 { 00:20:07.586 "name": "BaseBdev3", 00:20:07.586 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:07.586 "is_configured": true, 00:20:07.586 "data_offset": 2048, 00:20:07.586 "data_size": 63488 00:20:07.586 }, 00:20:07.586 { 00:20:07.586 "name": "BaseBdev4", 00:20:07.586 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:07.586 "is_configured": true, 00:20:07.586 "data_offset": 2048, 00:20:07.586 "data_size": 63488 00:20:07.586 } 00:20:07.586 ] 00:20:07.586 }' 00:20:07.586 12:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.586 12:01:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:08.153 12:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.153 12:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:08.412 12:01:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:08.413 12:01:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:08.671 [2024-07-15 12:01:22.096751] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.671 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:08.930 12:01:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.930 "name": "Existed_Raid", 00:20:08.930 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:08.930 "strip_size_kb": 64, 00:20:08.930 "state": "configuring", 00:20:08.930 "raid_level": "raid0", 00:20:08.930 "superblock": true, 00:20:08.930 "num_base_bdevs": 4, 00:20:08.930 "num_base_bdevs_discovered": 2, 00:20:08.930 "num_base_bdevs_operational": 4, 00:20:08.930 "base_bdevs_list": [ 00:20:08.930 { 00:20:08.930 "name": "BaseBdev1", 00:20:08.930 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:08.930 "is_configured": true, 00:20:08.930 "data_offset": 2048, 00:20:08.930 "data_size": 63488 00:20:08.930 }, 00:20:08.930 { 00:20:08.930 "name": null, 00:20:08.930 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:08.930 "is_configured": false, 00:20:08.930 "data_offset": 2048, 00:20:08.930 "data_size": 63488 00:20:08.930 }, 00:20:08.930 { 00:20:08.930 "name": null, 00:20:08.930 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:08.930 "is_configured": false, 00:20:08.930 "data_offset": 2048, 00:20:08.930 "data_size": 63488 00:20:08.930 }, 00:20:08.930 { 00:20:08.930 "name": "BaseBdev4", 00:20:08.930 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:08.930 "is_configured": true, 00:20:08.930 "data_offset": 2048, 00:20:08.930 "data_size": 63488 00:20:08.930 } 00:20:08.930 ] 00:20:08.930 }' 00:20:08.930 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.930 12:01:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:09.498 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:09.498 12:01:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:09.757 12:01:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:09.757 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:10.015 [2024-07-15 12:01:23.468402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:10.015 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.016 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:10.275 12:01:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.275 "name": "Existed_Raid", 00:20:10.275 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:10.275 "strip_size_kb": 64, 00:20:10.275 "state": "configuring", 00:20:10.275 "raid_level": "raid0", 00:20:10.275 "superblock": true, 00:20:10.275 "num_base_bdevs": 4, 00:20:10.275 "num_base_bdevs_discovered": 3, 00:20:10.275 "num_base_bdevs_operational": 4, 00:20:10.275 "base_bdevs_list": [ 00:20:10.275 { 00:20:10.275 "name": "BaseBdev1", 00:20:10.275 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:10.275 "is_configured": true, 00:20:10.275 "data_offset": 2048, 00:20:10.275 "data_size": 63488 00:20:10.275 }, 00:20:10.275 { 00:20:10.275 "name": null, 00:20:10.275 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:10.275 "is_configured": false, 00:20:10.275 "data_offset": 2048, 00:20:10.275 "data_size": 63488 00:20:10.275 }, 00:20:10.275 { 00:20:10.275 "name": "BaseBdev3", 00:20:10.275 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:10.275 "is_configured": true, 00:20:10.275 "data_offset": 2048, 00:20:10.275 "data_size": 63488 00:20:10.275 }, 00:20:10.275 { 00:20:10.275 "name": "BaseBdev4", 00:20:10.275 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:10.275 "is_configured": true, 00:20:10.275 "data_offset": 2048, 00:20:10.275 "data_size": 63488 00:20:10.275 } 00:20:10.275 ] 00:20:10.275 }' 00:20:10.275 12:01:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.275 12:01:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:10.842 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.842 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:11.102 12:01:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:11.102 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:11.361 [2024-07-15 12:01:24.735778] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:11.361 12:01:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:11.621 12:01:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:11.621 "name": "Existed_Raid", 00:20:11.621 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:11.621 "strip_size_kb": 64, 00:20:11.621 "state": "configuring", 00:20:11.621 "raid_level": "raid0", 00:20:11.621 "superblock": true, 00:20:11.621 "num_base_bdevs": 4, 00:20:11.621 "num_base_bdevs_discovered": 2, 00:20:11.621 "num_base_bdevs_operational": 4, 00:20:11.621 "base_bdevs_list": [ 00:20:11.621 { 00:20:11.621 "name": null, 00:20:11.621 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:11.621 "is_configured": false, 00:20:11.621 "data_offset": 2048, 00:20:11.621 "data_size": 63488 00:20:11.621 }, 00:20:11.621 { 00:20:11.621 "name": null, 00:20:11.621 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:11.621 "is_configured": false, 00:20:11.621 "data_offset": 2048, 00:20:11.621 "data_size": 63488 00:20:11.621 }, 00:20:11.621 { 00:20:11.621 "name": "BaseBdev3", 00:20:11.621 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:11.621 "is_configured": true, 00:20:11.621 "data_offset": 2048, 00:20:11.621 "data_size": 63488 00:20:11.621 }, 00:20:11.621 { 00:20:11.621 "name": "BaseBdev4", 00:20:11.621 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:11.621 "is_configured": true, 00:20:11.621 "data_offset": 2048, 00:20:11.621 "data_size": 63488 00:20:11.621 } 00:20:11.621 ] 00:20:11.621 }' 00:20:11.621 12:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:11.621 12:01:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:12.189 12:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.189 12:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:12.448 12:01:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:12.448 12:01:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:12.708 [2024-07-15 12:01:26.114059] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:12.708 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:12.967 12:01:26 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:12.967 "name": "Existed_Raid", 00:20:12.967 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:12.967 "strip_size_kb": 64, 00:20:12.967 "state": "configuring", 00:20:12.967 "raid_level": "raid0", 00:20:12.967 "superblock": true, 00:20:12.967 "num_base_bdevs": 4, 00:20:12.967 "num_base_bdevs_discovered": 3, 00:20:12.967 "num_base_bdevs_operational": 4, 00:20:12.967 "base_bdevs_list": [ 00:20:12.967 { 00:20:12.967 "name": null, 00:20:12.967 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:12.967 "is_configured": false, 00:20:12.967 "data_offset": 2048, 00:20:12.967 "data_size": 63488 00:20:12.967 }, 00:20:12.967 { 00:20:12.967 "name": "BaseBdev2", 00:20:12.967 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:12.967 "is_configured": true, 00:20:12.967 "data_offset": 2048, 00:20:12.967 "data_size": 63488 00:20:12.967 }, 00:20:12.967 { 00:20:12.967 "name": "BaseBdev3", 00:20:12.967 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:12.967 "is_configured": true, 00:20:12.967 "data_offset": 2048, 00:20:12.967 "data_size": 63488 00:20:12.967 }, 00:20:12.967 { 00:20:12.967 "name": "BaseBdev4", 00:20:12.967 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:12.967 "is_configured": true, 00:20:12.967 "data_offset": 2048, 00:20:12.967 "data_size": 63488 00:20:12.967 } 00:20:12.967 ] 00:20:12.967 }' 00:20:12.967 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:12.967 12:01:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.535 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.535 12:01:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:13.793 12:01:27 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:13.793 12:01:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.793 12:01:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:14.052 12:01:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 70979a0a-e479-4e71-b398-751aa6458a43 00:20:14.311 [2024-07-15 12:01:27.698440] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:14.311 [2024-07-15 12:01:27.698609] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x242e120 00:20:14.311 [2024-07-15 12:01:27.698622] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:14.311 [2024-07-15 12:01:27.698811] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242caf0 00:20:14.311 [2024-07-15 12:01:27.698929] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x242e120 00:20:14.311 [2024-07-15 12:01:27.698939] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x242e120 00:20:14.311 [2024-07-15 12:01:27.699037] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:14.311 NewBaseBdev 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:14.311 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:14.569 12:01:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:14.828 [ 00:20:14.828 { 00:20:14.828 "name": "NewBaseBdev", 00:20:14.828 "aliases": [ 00:20:14.828 "70979a0a-e479-4e71-b398-751aa6458a43" 00:20:14.828 ], 00:20:14.828 "product_name": "Malloc disk", 00:20:14.828 "block_size": 512, 00:20:14.828 "num_blocks": 65536, 00:20:14.828 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:14.828 "assigned_rate_limits": { 00:20:14.828 "rw_ios_per_sec": 0, 00:20:14.828 "rw_mbytes_per_sec": 0, 00:20:14.828 "r_mbytes_per_sec": 0, 00:20:14.828 "w_mbytes_per_sec": 0 00:20:14.828 }, 00:20:14.828 "claimed": true, 00:20:14.828 "claim_type": "exclusive_write", 00:20:14.828 "zoned": false, 00:20:14.828 "supported_io_types": { 00:20:14.828 "read": true, 00:20:14.828 "write": true, 00:20:14.828 "unmap": true, 00:20:14.828 "flush": true, 00:20:14.828 "reset": true, 00:20:14.828 "nvme_admin": false, 00:20:14.828 "nvme_io": false, 00:20:14.828 "nvme_io_md": false, 00:20:14.828 "write_zeroes": true, 00:20:14.828 "zcopy": true, 00:20:14.828 "get_zone_info": false, 00:20:14.828 "zone_management": false, 00:20:14.828 "zone_append": false, 00:20:14.828 "compare": false, 00:20:14.828 "compare_and_write": false, 00:20:14.828 "abort": true, 00:20:14.828 "seek_hole": false, 00:20:14.828 "seek_data": false, 00:20:14.828 "copy": true, 00:20:14.828 
"nvme_iov_md": false 00:20:14.828 }, 00:20:14.828 "memory_domains": [ 00:20:14.828 { 00:20:14.828 "dma_device_id": "system", 00:20:14.828 "dma_device_type": 1 00:20:14.828 }, 00:20:14.828 { 00:20:14.828 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:14.828 "dma_device_type": 2 00:20:14.828 } 00:20:14.828 ], 00:20:14.828 "driver_specific": {} 00:20:14.828 } 00:20:14.828 ] 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:14.828 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:20:15.087 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.087 "name": "Existed_Raid", 00:20:15.087 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:15.087 "strip_size_kb": 64, 00:20:15.087 "state": "online", 00:20:15.087 "raid_level": "raid0", 00:20:15.087 "superblock": true, 00:20:15.087 "num_base_bdevs": 4, 00:20:15.087 "num_base_bdevs_discovered": 4, 00:20:15.087 "num_base_bdevs_operational": 4, 00:20:15.087 "base_bdevs_list": [ 00:20:15.087 { 00:20:15.087 "name": "NewBaseBdev", 00:20:15.087 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:15.087 "is_configured": true, 00:20:15.087 "data_offset": 2048, 00:20:15.087 "data_size": 63488 00:20:15.087 }, 00:20:15.087 { 00:20:15.087 "name": "BaseBdev2", 00:20:15.087 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:15.087 "is_configured": true, 00:20:15.087 "data_offset": 2048, 00:20:15.087 "data_size": 63488 00:20:15.087 }, 00:20:15.087 { 00:20:15.087 "name": "BaseBdev3", 00:20:15.087 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:15.087 "is_configured": true, 00:20:15.087 "data_offset": 2048, 00:20:15.087 "data_size": 63488 00:20:15.087 }, 00:20:15.087 { 00:20:15.087 "name": "BaseBdev4", 00:20:15.087 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:15.087 "is_configured": true, 00:20:15.087 "data_offset": 2048, 00:20:15.087 "data_size": 63488 00:20:15.087 } 00:20:15.087 ] 00:20:15.087 }' 00:20:15.087 12:01:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.087 12:01:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:15.656 [2024-07-15 12:01:29.210801] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:15.656 "name": "Existed_Raid", 00:20:15.656 "aliases": [ 00:20:15.656 "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4" 00:20:15.656 ], 00:20:15.656 "product_name": "Raid Volume", 00:20:15.656 "block_size": 512, 00:20:15.656 "num_blocks": 253952, 00:20:15.656 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:15.656 "assigned_rate_limits": { 00:20:15.656 "rw_ios_per_sec": 0, 00:20:15.656 "rw_mbytes_per_sec": 0, 00:20:15.656 "r_mbytes_per_sec": 0, 00:20:15.656 "w_mbytes_per_sec": 0 00:20:15.656 }, 00:20:15.656 "claimed": false, 00:20:15.656 "zoned": false, 00:20:15.656 "supported_io_types": { 00:20:15.656 "read": true, 00:20:15.656 "write": true, 00:20:15.656 "unmap": true, 00:20:15.656 "flush": true, 00:20:15.656 "reset": true, 00:20:15.656 "nvme_admin": false, 00:20:15.656 "nvme_io": false, 00:20:15.656 "nvme_io_md": false, 00:20:15.656 "write_zeroes": true, 00:20:15.656 "zcopy": false, 00:20:15.656 "get_zone_info": false, 00:20:15.656 "zone_management": false, 00:20:15.656 "zone_append": false, 00:20:15.656 "compare": false, 00:20:15.656 "compare_and_write": false, 00:20:15.656 "abort": false, 
00:20:15.656 "seek_hole": false, 00:20:15.656 "seek_data": false, 00:20:15.656 "copy": false, 00:20:15.656 "nvme_iov_md": false 00:20:15.656 }, 00:20:15.656 "memory_domains": [ 00:20:15.656 { 00:20:15.656 "dma_device_id": "system", 00:20:15.656 "dma_device_type": 1 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.656 "dma_device_type": 2 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "system", 00:20:15.656 "dma_device_type": 1 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.656 "dma_device_type": 2 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "system", 00:20:15.656 "dma_device_type": 1 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.656 "dma_device_type": 2 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "system", 00:20:15.656 "dma_device_type": 1 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.656 "dma_device_type": 2 00:20:15.656 } 00:20:15.656 ], 00:20:15.656 "driver_specific": { 00:20:15.656 "raid": { 00:20:15.656 "uuid": "9cb0f036-f7fb-42a7-b22d-c7a1842e05e4", 00:20:15.656 "strip_size_kb": 64, 00:20:15.656 "state": "online", 00:20:15.656 "raid_level": "raid0", 00:20:15.656 "superblock": true, 00:20:15.656 "num_base_bdevs": 4, 00:20:15.656 "num_base_bdevs_discovered": 4, 00:20:15.656 "num_base_bdevs_operational": 4, 00:20:15.656 "base_bdevs_list": [ 00:20:15.656 { 00:20:15.656 "name": "NewBaseBdev", 00:20:15.656 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:15.656 "is_configured": true, 00:20:15.656 "data_offset": 2048, 00:20:15.656 "data_size": 63488 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "name": "BaseBdev2", 00:20:15.656 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:15.656 "is_configured": true, 00:20:15.656 "data_offset": 2048, 00:20:15.656 "data_size": 63488 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "name": 
"BaseBdev3", 00:20:15.656 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:15.656 "is_configured": true, 00:20:15.656 "data_offset": 2048, 00:20:15.656 "data_size": 63488 00:20:15.656 }, 00:20:15.656 { 00:20:15.656 "name": "BaseBdev4", 00:20:15.656 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:15.656 "is_configured": true, 00:20:15.656 "data_offset": 2048, 00:20:15.656 "data_size": 63488 00:20:15.656 } 00:20:15.656 ] 00:20:15.656 } 00:20:15.656 } 00:20:15.656 }' 00:20:15.656 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:15.925 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:15.925 BaseBdev2 00:20:15.925 BaseBdev3 00:20:15.925 BaseBdev4' 00:20:15.925 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:15.925 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:15.925 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.220 "name": "NewBaseBdev", 00:20:16.220 "aliases": [ 00:20:16.220 "70979a0a-e479-4e71-b398-751aa6458a43" 00:20:16.220 ], 00:20:16.220 "product_name": "Malloc disk", 00:20:16.220 "block_size": 512, 00:20:16.220 "num_blocks": 65536, 00:20:16.220 "uuid": "70979a0a-e479-4e71-b398-751aa6458a43", 00:20:16.220 "assigned_rate_limits": { 00:20:16.220 "rw_ios_per_sec": 0, 00:20:16.220 "rw_mbytes_per_sec": 0, 00:20:16.220 "r_mbytes_per_sec": 0, 00:20:16.220 "w_mbytes_per_sec": 0 00:20:16.220 }, 00:20:16.220 "claimed": true, 00:20:16.220 "claim_type": "exclusive_write", 00:20:16.220 "zoned": false, 00:20:16.220 
"supported_io_types": { 00:20:16.220 "read": true, 00:20:16.220 "write": true, 00:20:16.220 "unmap": true, 00:20:16.220 "flush": true, 00:20:16.220 "reset": true, 00:20:16.220 "nvme_admin": false, 00:20:16.220 "nvme_io": false, 00:20:16.220 "nvme_io_md": false, 00:20:16.220 "write_zeroes": true, 00:20:16.220 "zcopy": true, 00:20:16.220 "get_zone_info": false, 00:20:16.220 "zone_management": false, 00:20:16.220 "zone_append": false, 00:20:16.220 "compare": false, 00:20:16.220 "compare_and_write": false, 00:20:16.220 "abort": true, 00:20:16.220 "seek_hole": false, 00:20:16.220 "seek_data": false, 00:20:16.220 "copy": true, 00:20:16.220 "nvme_iov_md": false 00:20:16.220 }, 00:20:16.220 "memory_domains": [ 00:20:16.220 { 00:20:16.220 "dma_device_id": "system", 00:20:16.220 "dma_device_type": 1 00:20:16.220 }, 00:20:16.220 { 00:20:16.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.220 "dma_device_type": 2 00:20:16.220 } 00:20:16.220 ], 00:20:16.220 "driver_specific": {} 00:20:16.220 }' 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.220 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.220 12:01:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.478 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.478 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.478 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.478 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:16.478 12:01:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:16.737 "name": "BaseBdev2", 00:20:16.737 "aliases": [ 00:20:16.737 "ec45357c-98f0-4618-bf8d-e91369776704" 00:20:16.737 ], 00:20:16.737 "product_name": "Malloc disk", 00:20:16.737 "block_size": 512, 00:20:16.737 "num_blocks": 65536, 00:20:16.737 "uuid": "ec45357c-98f0-4618-bf8d-e91369776704", 00:20:16.737 "assigned_rate_limits": { 00:20:16.737 "rw_ios_per_sec": 0, 00:20:16.737 "rw_mbytes_per_sec": 0, 00:20:16.737 "r_mbytes_per_sec": 0, 00:20:16.737 "w_mbytes_per_sec": 0 00:20:16.737 }, 00:20:16.737 "claimed": true, 00:20:16.737 "claim_type": "exclusive_write", 00:20:16.737 "zoned": false, 00:20:16.737 "supported_io_types": { 00:20:16.737 "read": true, 00:20:16.737 "write": true, 00:20:16.737 "unmap": true, 00:20:16.737 "flush": true, 00:20:16.737 "reset": true, 00:20:16.737 "nvme_admin": false, 00:20:16.737 "nvme_io": false, 00:20:16.737 "nvme_io_md": false, 00:20:16.737 "write_zeroes": true, 00:20:16.737 "zcopy": true, 00:20:16.737 "get_zone_info": false, 00:20:16.737 "zone_management": false, 00:20:16.737 "zone_append": false, 00:20:16.737 "compare": false, 00:20:16.737 "compare_and_write": false, 00:20:16.737 "abort": true, 00:20:16.737 
"seek_hole": false, 00:20:16.737 "seek_data": false, 00:20:16.737 "copy": true, 00:20:16.737 "nvme_iov_md": false 00:20:16.737 }, 00:20:16.737 "memory_domains": [ 00:20:16.737 { 00:20:16.737 "dma_device_id": "system", 00:20:16.737 "dma_device_type": 1 00:20:16.737 }, 00:20:16.737 { 00:20:16.737 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:16.737 "dma_device_type": 2 00:20:16.737 } 00:20:16.737 ], 00:20:16.737 "driver_specific": {} 00:20:16.737 }' 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.737 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:16.996 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.255 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.255 "name": "BaseBdev3", 00:20:17.255 "aliases": [ 00:20:17.255 "931c2558-0bef-4c46-afab-f70c13c9d4fc" 00:20:17.255 ], 00:20:17.255 "product_name": "Malloc disk", 00:20:17.255 "block_size": 512, 00:20:17.255 "num_blocks": 65536, 00:20:17.255 "uuid": "931c2558-0bef-4c46-afab-f70c13c9d4fc", 00:20:17.255 "assigned_rate_limits": { 00:20:17.255 "rw_ios_per_sec": 0, 00:20:17.255 "rw_mbytes_per_sec": 0, 00:20:17.255 "r_mbytes_per_sec": 0, 00:20:17.255 "w_mbytes_per_sec": 0 00:20:17.255 }, 00:20:17.255 "claimed": true, 00:20:17.255 "claim_type": "exclusive_write", 00:20:17.255 "zoned": false, 00:20:17.255 "supported_io_types": { 00:20:17.255 "read": true, 00:20:17.255 "write": true, 00:20:17.255 "unmap": true, 00:20:17.255 "flush": true, 00:20:17.255 "reset": true, 00:20:17.255 "nvme_admin": false, 00:20:17.255 "nvme_io": false, 00:20:17.255 "nvme_io_md": false, 00:20:17.255 "write_zeroes": true, 00:20:17.255 "zcopy": true, 00:20:17.255 "get_zone_info": false, 00:20:17.255 "zone_management": false, 00:20:17.255 "zone_append": false, 00:20:17.255 "compare": false, 00:20:17.255 "compare_and_write": false, 00:20:17.255 "abort": true, 00:20:17.255 "seek_hole": false, 00:20:17.255 "seek_data": false, 00:20:17.255 "copy": true, 00:20:17.255 "nvme_iov_md": false 00:20:17.255 }, 00:20:17.255 "memory_domains": [ 00:20:17.255 { 00:20:17.255 "dma_device_id": "system", 00:20:17.255 "dma_device_type": 1 00:20:17.255 }, 00:20:17.255 { 00:20:17.255 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.255 "dma_device_type": 2 00:20:17.255 } 00:20:17.255 ], 00:20:17.255 "driver_specific": {} 00:20:17.255 }' 00:20:17.255 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.255 
12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:17.255 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:17.255 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:17.514 12:01:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.514 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:17.514 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:17.514 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:17.514 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:17.514 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:17.773 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:17.773 "name": "BaseBdev4", 00:20:17.773 "aliases": [ 00:20:17.773 "b21f87e3-94cc-48e6-8dce-eabc4a3723c8" 00:20:17.773 ], 00:20:17.773 "product_name": "Malloc disk", 00:20:17.773 "block_size": 512, 00:20:17.773 "num_blocks": 65536, 00:20:17.773 "uuid": "b21f87e3-94cc-48e6-8dce-eabc4a3723c8", 00:20:17.773 
"assigned_rate_limits": { 00:20:17.773 "rw_ios_per_sec": 0, 00:20:17.773 "rw_mbytes_per_sec": 0, 00:20:17.773 "r_mbytes_per_sec": 0, 00:20:17.773 "w_mbytes_per_sec": 0 00:20:17.773 }, 00:20:17.773 "claimed": true, 00:20:17.773 "claim_type": "exclusive_write", 00:20:17.773 "zoned": false, 00:20:17.773 "supported_io_types": { 00:20:17.773 "read": true, 00:20:17.773 "write": true, 00:20:17.773 "unmap": true, 00:20:17.773 "flush": true, 00:20:17.773 "reset": true, 00:20:17.773 "nvme_admin": false, 00:20:17.773 "nvme_io": false, 00:20:17.773 "nvme_io_md": false, 00:20:17.773 "write_zeroes": true, 00:20:17.773 "zcopy": true, 00:20:17.773 "get_zone_info": false, 00:20:17.773 "zone_management": false, 00:20:17.773 "zone_append": false, 00:20:17.773 "compare": false, 00:20:17.773 "compare_and_write": false, 00:20:17.773 "abort": true, 00:20:17.773 "seek_hole": false, 00:20:17.773 "seek_data": false, 00:20:17.773 "copy": true, 00:20:17.773 "nvme_iov_md": false 00:20:17.773 }, 00:20:17.773 "memory_domains": [ 00:20:17.773 { 00:20:17.773 "dma_device_id": "system", 00:20:17.773 "dma_device_type": 1 00:20:17.773 }, 00:20:17.773 { 00:20:17.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:17.773 "dma_device_type": 2 00:20:17.773 } 00:20:17.773 ], 00:20:17.773 "driver_specific": {} 00:20:17.773 }' 00:20:17.773 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:18.032 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.290 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:18.291 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:18.291 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:18.550 [2024-07-15 12:01:31.925707] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:18.550 [2024-07-15 12:01:31.925738] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:18.550 [2024-07-15 12:01:31.925795] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:18.550 [2024-07-15 12:01:31.925855] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:18.550 [2024-07-15 12:01:31.925867] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x242e120 name Existed_Raid, state offline 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1522300 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1522300 ']' 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@952 -- # kill -0 1522300 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 
-- # '[' Linux = Linux ']' 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1522300 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1522300' 00:20:18.550 killing process with pid 1522300 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1522300 00:20:18.550 [2024-07-15 12:01:31.992237] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:18.550 12:01:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1522300 00:20:18.550 [2024-07-15 12:01:32.029133] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:18.810 12:01:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:18.810 00:20:18.810 real 0m33.248s 00:20:18.810 user 1m0.990s 00:20:18.810 sys 0m5.983s 00:20:18.810 12:01:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:18.810 12:01:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:18.810 ************************************ 00:20:18.810 END TEST raid_state_function_test_sb 00:20:18.810 ************************************ 00:20:18.810 12:01:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:18.810 12:01:32 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:20:18.810 12:01:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:20:18.810 12:01:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:18.810 12:01:32 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:20:18.810 ************************************ 00:20:18.810 START TEST raid_superblock_test 00:20:18.810 ************************************ 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid0 4 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 
00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1527299 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1527299 /var/tmp/spdk-raid.sock 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1527299 ']' 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:18.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:18.810 12:01:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:18.810 [2024-07-15 12:01:32.395084] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:20:18.810 [2024-07-15 12:01:32.395151] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1527299 ] 00:20:19.070 [2024-07-15 12:01:32.525955] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:19.070 [2024-07-15 12:01:32.631742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:19.329 [2024-07-15 12:01:32.700514] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:19.329 [2024-07-15 12:01:32.700558] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:19.897 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:20.156 malloc1 00:20:20.156 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:20.415 [2024-07-15 12:01:33.814694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:20.415 [2024-07-15 12:01:33.814741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.415 [2024-07-15 12:01:33.814763] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd65560 00:20:20.415 [2024-07-15 12:01:33.814775] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.415 [2024-07-15 12:01:33.816347] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.415 [2024-07-15 12:01:33.816375] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:20.415 pt1 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:20.415 12:01:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:20.415 12:01:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:20.675 malloc2 00:20:20.675 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:20.934 [2024-07-15 12:01:34.364977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:20.934 [2024-07-15 12:01:34.365026] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:20.934 [2024-07-15 12:01:34.365043] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe035b0 00:20:20.934 [2024-07-15 12:01:34.365056] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:20.934 [2024-07-15 12:01:34.366601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:20.934 [2024-07-15 12:01:34.366629] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:20.934 pt2 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:20.934 12:01:34 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:20.934 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:21.193 malloc3 00:20:21.193 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:21.453 [2024-07-15 12:01:34.860093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:21.453 [2024-07-15 12:01:34.860139] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.453 [2024-07-15 12:01:34.860156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe03be0 00:20:21.453 [2024-07-15 12:01:34.860168] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.453 [2024-07-15 12:01:34.861755] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.453 [2024-07-15 12:01:34.861786] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:21.453 pt3 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:21.453 
12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:21.453 12:01:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:21.712 malloc4 00:20:21.712 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:21.972 [2024-07-15 12:01:35.343157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:21.972 [2024-07-15 12:01:35.343202] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:21.972 [2024-07-15 12:01:35.343219] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe06f00 00:20:21.972 [2024-07-15 12:01:35.343237] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.972 [2024-07-15 12:01:35.344775] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.972 [2024-07-15 12:01:35.344803] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:21.972 pt4 00:20:21.972 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:21.972 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:21.972 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:22.231 [2024-07-15 12:01:35.583816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:20:22.231 [2024-07-15 12:01:35.585135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:22.231 [2024-07-15 12:01:35.585191] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:22.232 [2024-07-15 12:01:35.585235] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:22.232 [2024-07-15 12:01:35.585405] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe06880 00:20:22.232 [2024-07-15 12:01:35.585416] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:22.232 [2024-07-15 12:01:35.585619] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe06850 00:20:22.232 [2024-07-15 12:01:35.585776] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe06880 00:20:22.232 [2024-07-15 12:01:35.585786] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe06880 00:20:22.232 [2024-07-15 12:01:35.585888] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:22.232 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:22.491 12:01:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:22.491 "name": "raid_bdev1", 00:20:22.491 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:22.491 "strip_size_kb": 64, 00:20:22.491 "state": "online", 00:20:22.491 "raid_level": "raid0", 00:20:22.491 "superblock": true, 00:20:22.491 "num_base_bdevs": 4, 00:20:22.491 "num_base_bdevs_discovered": 4, 00:20:22.491 "num_base_bdevs_operational": 4, 00:20:22.491 "base_bdevs_list": [ 00:20:22.491 { 00:20:22.491 "name": "pt1", 00:20:22.491 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:22.491 "is_configured": true, 00:20:22.491 "data_offset": 2048, 00:20:22.491 "data_size": 63488 00:20:22.491 }, 00:20:22.491 { 00:20:22.491 "name": "pt2", 00:20:22.491 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:22.491 "is_configured": true, 00:20:22.491 "data_offset": 2048, 00:20:22.491 "data_size": 63488 00:20:22.491 }, 00:20:22.491 { 00:20:22.491 "name": "pt3", 00:20:22.491 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:22.491 "is_configured": true, 00:20:22.491 "data_offset": 2048, 00:20:22.491 "data_size": 63488 00:20:22.491 }, 00:20:22.491 { 00:20:22.491 "name": "pt4", 00:20:22.491 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:22.491 "is_configured": true, 00:20:22.491 "data_offset": 2048, 00:20:22.491 "data_size": 63488 00:20:22.491 } 00:20:22.491 ] 00:20:22.491 }' 00:20:22.491 12:01:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:22.491 12:01:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:23.060 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:23.319 [2024-07-15 12:01:36.679003] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:23.319 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:23.319 "name": "raid_bdev1", 00:20:23.319 "aliases": [ 00:20:23.319 "7b4987e9-d684-41d4-a73c-082b014c4101" 00:20:23.319 ], 00:20:23.319 "product_name": "Raid Volume", 00:20:23.319 "block_size": 512, 00:20:23.319 "num_blocks": 253952, 00:20:23.319 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:23.319 "assigned_rate_limits": { 00:20:23.319 "rw_ios_per_sec": 0, 00:20:23.319 "rw_mbytes_per_sec": 0, 00:20:23.319 "r_mbytes_per_sec": 0, 00:20:23.319 "w_mbytes_per_sec": 0 00:20:23.319 }, 00:20:23.319 "claimed": false, 00:20:23.319 "zoned": false, 00:20:23.319 "supported_io_types": { 00:20:23.319 "read": true, 00:20:23.319 "write": true, 00:20:23.319 
"unmap": true, 00:20:23.319 "flush": true, 00:20:23.319 "reset": true, 00:20:23.319 "nvme_admin": false, 00:20:23.319 "nvme_io": false, 00:20:23.319 "nvme_io_md": false, 00:20:23.319 "write_zeroes": true, 00:20:23.319 "zcopy": false, 00:20:23.319 "get_zone_info": false, 00:20:23.319 "zone_management": false, 00:20:23.319 "zone_append": false, 00:20:23.319 "compare": false, 00:20:23.319 "compare_and_write": false, 00:20:23.319 "abort": false, 00:20:23.319 "seek_hole": false, 00:20:23.319 "seek_data": false, 00:20:23.319 "copy": false, 00:20:23.319 "nvme_iov_md": false 00:20:23.319 }, 00:20:23.319 "memory_domains": [ 00:20:23.319 { 00:20:23.319 "dma_device_id": "system", 00:20:23.319 "dma_device_type": 1 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.319 "dma_device_type": 2 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "system", 00:20:23.319 "dma_device_type": 1 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.319 "dma_device_type": 2 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "system", 00:20:23.319 "dma_device_type": 1 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.319 "dma_device_type": 2 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "system", 00:20:23.319 "dma_device_type": 1 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.319 "dma_device_type": 2 00:20:23.319 } 00:20:23.319 ], 00:20:23.319 "driver_specific": { 00:20:23.319 "raid": { 00:20:23.319 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:23.319 "strip_size_kb": 64, 00:20:23.319 "state": "online", 00:20:23.319 "raid_level": "raid0", 00:20:23.319 "superblock": true, 00:20:23.319 "num_base_bdevs": 4, 00:20:23.319 "num_base_bdevs_discovered": 4, 00:20:23.319 "num_base_bdevs_operational": 4, 00:20:23.319 "base_bdevs_list": [ 00:20:23.319 { 00:20:23.319 "name": "pt1", 
00:20:23.319 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:23.319 "is_configured": true, 00:20:23.319 "data_offset": 2048, 00:20:23.319 "data_size": 63488 00:20:23.319 }, 00:20:23.319 { 00:20:23.319 "name": "pt2", 00:20:23.320 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:23.320 "is_configured": true, 00:20:23.320 "data_offset": 2048, 00:20:23.320 "data_size": 63488 00:20:23.320 }, 00:20:23.320 { 00:20:23.320 "name": "pt3", 00:20:23.320 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:23.320 "is_configured": true, 00:20:23.320 "data_offset": 2048, 00:20:23.320 "data_size": 63488 00:20:23.320 }, 00:20:23.320 { 00:20:23.320 "name": "pt4", 00:20:23.320 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:23.320 "is_configured": true, 00:20:23.320 "data_offset": 2048, 00:20:23.320 "data_size": 63488 00:20:23.320 } 00:20:23.320 ] 00:20:23.320 } 00:20:23.320 } 00:20:23.320 }' 00:20:23.320 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:23.320 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:23.320 pt2 00:20:23.320 pt3 00:20:23.320 pt4' 00:20:23.320 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.320 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:23.320 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.579 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.579 "name": "pt1", 00:20:23.579 "aliases": [ 00:20:23.579 "00000000-0000-0000-0000-000000000001" 00:20:23.579 ], 00:20:23.579 "product_name": "passthru", 00:20:23.579 "block_size": 512, 00:20:23.579 "num_blocks": 65536, 00:20:23.579 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:20:23.579 "assigned_rate_limits": { 00:20:23.579 "rw_ios_per_sec": 0, 00:20:23.579 "rw_mbytes_per_sec": 0, 00:20:23.579 "r_mbytes_per_sec": 0, 00:20:23.579 "w_mbytes_per_sec": 0 00:20:23.579 }, 00:20:23.579 "claimed": true, 00:20:23.579 "claim_type": "exclusive_write", 00:20:23.579 "zoned": false, 00:20:23.579 "supported_io_types": { 00:20:23.579 "read": true, 00:20:23.579 "write": true, 00:20:23.579 "unmap": true, 00:20:23.579 "flush": true, 00:20:23.579 "reset": true, 00:20:23.579 "nvme_admin": false, 00:20:23.579 "nvme_io": false, 00:20:23.579 "nvme_io_md": false, 00:20:23.579 "write_zeroes": true, 00:20:23.579 "zcopy": true, 00:20:23.579 "get_zone_info": false, 00:20:23.579 "zone_management": false, 00:20:23.579 "zone_append": false, 00:20:23.579 "compare": false, 00:20:23.579 "compare_and_write": false, 00:20:23.579 "abort": true, 00:20:23.579 "seek_hole": false, 00:20:23.579 "seek_data": false, 00:20:23.579 "copy": true, 00:20:23.579 "nvme_iov_md": false 00:20:23.579 }, 00:20:23.579 "memory_domains": [ 00:20:23.579 { 00:20:23.579 "dma_device_id": "system", 00:20:23.579 "dma_device_type": 1 00:20:23.579 }, 00:20:23.579 { 00:20:23.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.579 "dma_device_type": 2 00:20:23.579 } 00:20:23.579 ], 00:20:23.579 "driver_specific": { 00:20:23.579 "passthru": { 00:20:23.579 "name": "pt1", 00:20:23.579 "base_bdev_name": "malloc1" 00:20:23.579 } 00:20:23.579 } 00:20:23.579 }' 00:20:23.579 12:01:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.579 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.579 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.579 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.579 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.579 12:01:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.579 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:23.838 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.097 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.097 "name": "pt2", 00:20:24.097 "aliases": [ 00:20:24.097 "00000000-0000-0000-0000-000000000002" 00:20:24.097 ], 00:20:24.097 "product_name": "passthru", 00:20:24.097 "block_size": 512, 00:20:24.097 "num_blocks": 65536, 00:20:24.097 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:24.097 "assigned_rate_limits": { 00:20:24.097 "rw_ios_per_sec": 0, 00:20:24.097 "rw_mbytes_per_sec": 0, 00:20:24.097 "r_mbytes_per_sec": 0, 00:20:24.097 "w_mbytes_per_sec": 0 00:20:24.097 }, 00:20:24.097 "claimed": true, 00:20:24.097 "claim_type": "exclusive_write", 00:20:24.097 "zoned": false, 00:20:24.097 "supported_io_types": { 00:20:24.097 "read": true, 00:20:24.097 "write": true, 00:20:24.097 "unmap": true, 00:20:24.097 "flush": true, 00:20:24.097 "reset": true, 00:20:24.097 "nvme_admin": false, 00:20:24.097 
"nvme_io": false, 00:20:24.097 "nvme_io_md": false, 00:20:24.097 "write_zeroes": true, 00:20:24.097 "zcopy": true, 00:20:24.097 "get_zone_info": false, 00:20:24.097 "zone_management": false, 00:20:24.097 "zone_append": false, 00:20:24.097 "compare": false, 00:20:24.097 "compare_and_write": false, 00:20:24.097 "abort": true, 00:20:24.097 "seek_hole": false, 00:20:24.097 "seek_data": false, 00:20:24.097 "copy": true, 00:20:24.097 "nvme_iov_md": false 00:20:24.097 }, 00:20:24.097 "memory_domains": [ 00:20:24.097 { 00:20:24.097 "dma_device_id": "system", 00:20:24.097 "dma_device_type": 1 00:20:24.097 }, 00:20:24.097 { 00:20:24.097 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.097 "dma_device_type": 2 00:20:24.097 } 00:20:24.097 ], 00:20:24.097 "driver_specific": { 00:20:24.097 "passthru": { 00:20:24.097 "name": "pt2", 00:20:24.097 "base_bdev_name": "malloc2" 00:20:24.097 } 00:20:24.097 } 00:20:24.097 }' 00:20:24.097 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.097 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.097 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.097 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:24.357 12:01:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.617 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.617 "name": "pt3", 00:20:24.617 "aliases": [ 00:20:24.617 "00000000-0000-0000-0000-000000000003" 00:20:24.617 ], 00:20:24.617 "product_name": "passthru", 00:20:24.617 "block_size": 512, 00:20:24.617 "num_blocks": 65536, 00:20:24.617 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:24.617 "assigned_rate_limits": { 00:20:24.617 "rw_ios_per_sec": 0, 00:20:24.617 "rw_mbytes_per_sec": 0, 00:20:24.617 "r_mbytes_per_sec": 0, 00:20:24.617 "w_mbytes_per_sec": 0 00:20:24.617 }, 00:20:24.617 "claimed": true, 00:20:24.617 "claim_type": "exclusive_write", 00:20:24.617 "zoned": false, 00:20:24.617 "supported_io_types": { 00:20:24.617 "read": true, 00:20:24.617 "write": true, 00:20:24.617 "unmap": true, 00:20:24.617 "flush": true, 00:20:24.617 "reset": true, 00:20:24.617 "nvme_admin": false, 00:20:24.617 "nvme_io": false, 00:20:24.617 "nvme_io_md": false, 00:20:24.617 "write_zeroes": true, 00:20:24.617 "zcopy": true, 00:20:24.617 "get_zone_info": false, 00:20:24.617 "zone_management": false, 00:20:24.617 "zone_append": false, 00:20:24.617 "compare": false, 00:20:24.617 "compare_and_write": false, 00:20:24.617 "abort": true, 00:20:24.617 "seek_hole": false, 00:20:24.617 "seek_data": false, 00:20:24.617 "copy": true, 00:20:24.617 "nvme_iov_md": false 00:20:24.617 }, 00:20:24.617 "memory_domains": [ 00:20:24.617 { 00:20:24.617 "dma_device_id": "system", 00:20:24.617 
"dma_device_type": 1 00:20:24.617 }, 00:20:24.617 { 00:20:24.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.617 "dma_device_type": 2 00:20:24.617 } 00:20:24.617 ], 00:20:24.617 "driver_specific": { 00:20:24.617 "passthru": { 00:20:24.617 "name": "pt3", 00:20:24.617 "base_bdev_name": "malloc3" 00:20:24.617 } 00:20:24.617 } 00:20:24.617 }' 00:20:24.617 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.877 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.136 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.136 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:25.136 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:25.136 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:25.136 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:25.396 "name": "pt4", 00:20:25.396 "aliases": [ 00:20:25.396 "00000000-0000-0000-0000-000000000004" 00:20:25.396 ], 00:20:25.396 "product_name": "passthru", 00:20:25.396 "block_size": 512, 00:20:25.396 "num_blocks": 65536, 00:20:25.396 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:25.396 "assigned_rate_limits": { 00:20:25.396 "rw_ios_per_sec": 0, 00:20:25.396 "rw_mbytes_per_sec": 0, 00:20:25.396 "r_mbytes_per_sec": 0, 00:20:25.396 "w_mbytes_per_sec": 0 00:20:25.396 }, 00:20:25.396 "claimed": true, 00:20:25.396 "claim_type": "exclusive_write", 00:20:25.396 "zoned": false, 00:20:25.396 "supported_io_types": { 00:20:25.396 "read": true, 00:20:25.396 "write": true, 00:20:25.396 "unmap": true, 00:20:25.396 "flush": true, 00:20:25.396 "reset": true, 00:20:25.396 "nvme_admin": false, 00:20:25.396 "nvme_io": false, 00:20:25.396 "nvme_io_md": false, 00:20:25.396 "write_zeroes": true, 00:20:25.396 "zcopy": true, 00:20:25.396 "get_zone_info": false, 00:20:25.396 "zone_management": false, 00:20:25.396 "zone_append": false, 00:20:25.396 "compare": false, 00:20:25.396 "compare_and_write": false, 00:20:25.396 "abort": true, 00:20:25.396 "seek_hole": false, 00:20:25.396 "seek_data": false, 00:20:25.396 "copy": true, 00:20:25.396 "nvme_iov_md": false 00:20:25.396 }, 00:20:25.396 "memory_domains": [ 00:20:25.396 { 00:20:25.396 "dma_device_id": "system", 00:20:25.396 "dma_device_type": 1 00:20:25.396 }, 00:20:25.396 { 00:20:25.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.396 "dma_device_type": 2 00:20:25.396 } 00:20:25.396 ], 00:20:25.396 "driver_specific": { 00:20:25.396 "passthru": { 00:20:25.396 "name": "pt4", 00:20:25.396 "base_bdev_name": "malloc4" 00:20:25.396 } 00:20:25.396 } 00:20:25.396 }' 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.396 12:01:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:25.396 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.655 12:01:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:25.655 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:25.913 [2024-07-15 12:01:39.289918] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:25.913 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=7b4987e9-d684-41d4-a73c-082b014c4101 00:20:25.913 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 7b4987e9-d684-41d4-a73c-082b014c4101 ']' 00:20:25.913 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:26.170 [2024-07-15 12:01:39.538318] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:26.170 
[2024-07-15 12:01:39.538342] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:26.170 [2024-07-15 12:01:39.538397] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:26.170 [2024-07-15 12:01:39.538459] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:26.170 [2024-07-15 12:01:39.538476] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe06880 name raid_bdev1, state offline 00:20:26.170 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.170 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:26.427 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:26.427 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:26.427 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:26.427 12:01:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:26.685 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:26.685 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:26.944 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:26.944 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:27.202 12:01:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:27.202 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:27.462 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:27.462 12:01:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:27.462 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:27.722 [2024-07-15 12:01:41.270831] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:27.722 [2024-07-15 12:01:41.272179] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:27.722 [2024-07-15 12:01:41.272222] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:27.722 [2024-07-15 12:01:41.272256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:27.722 [2024-07-15 12:01:41.272301] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:27.722 [2024-07-15 12:01:41.272341] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:27.722 [2024-07-15 12:01:41.272364] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:27.722 [2024-07-15 12:01:41.272385] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:27.722 
[2024-07-15 12:01:41.272404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:27.722 [2024-07-15 12:01:41.272413] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd68080 name raid_bdev1, state configuring 00:20:27.722 request: 00:20:27.722 { 00:20:27.722 "name": "raid_bdev1", 00:20:27.722 "raid_level": "raid0", 00:20:27.722 "base_bdevs": [ 00:20:27.722 "malloc1", 00:20:27.722 "malloc2", 00:20:27.722 "malloc3", 00:20:27.722 "malloc4" 00:20:27.722 ], 00:20:27.722 "strip_size_kb": 64, 00:20:27.722 "superblock": false, 00:20:27.722 "method": "bdev_raid_create", 00:20:27.722 "req_id": 1 00:20:27.722 } 00:20:27.722 Got JSON-RPC error response 00:20:27.722 response: 00:20:27.722 { 00:20:27.722 "code": -17, 00:20:27.722 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:27.722 } 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:20:27.722 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.981 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:27.981 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:27.981 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:20:28.240 [2024-07-15 12:01:41.756045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:28.240 [2024-07-15 12:01:41.756094] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:28.240 [2024-07-15 12:01:41.756112] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe051f0 00:20:28.240 [2024-07-15 12:01:41.756125] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:28.240 [2024-07-15 12:01:41.757741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:28.240 [2024-07-15 12:01:41.757773] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:28.240 [2024-07-15 12:01:41.757838] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:28.240 [2024-07-15 12:01:41.757864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:28.240 pt1 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.240 12:01:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:28.498 12:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.498 "name": "raid_bdev1", 00:20:28.498 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:28.498 "strip_size_kb": 64, 00:20:28.498 "state": "configuring", 00:20:28.498 "raid_level": "raid0", 00:20:28.498 "superblock": true, 00:20:28.498 "num_base_bdevs": 4, 00:20:28.498 "num_base_bdevs_discovered": 1, 00:20:28.498 "num_base_bdevs_operational": 4, 00:20:28.498 "base_bdevs_list": [ 00:20:28.498 { 00:20:28.498 "name": "pt1", 00:20:28.498 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:28.498 "is_configured": true, 00:20:28.498 "data_offset": 2048, 00:20:28.498 "data_size": 63488 00:20:28.498 }, 00:20:28.498 { 00:20:28.498 "name": null, 00:20:28.498 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:28.498 "is_configured": false, 00:20:28.498 "data_offset": 2048, 00:20:28.498 "data_size": 63488 00:20:28.499 }, 00:20:28.499 { 00:20:28.499 "name": null, 00:20:28.499 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:28.499 "is_configured": false, 00:20:28.499 "data_offset": 2048, 00:20:28.499 "data_size": 63488 00:20:28.499 }, 00:20:28.499 { 00:20:28.499 "name": null, 00:20:28.499 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:28.499 "is_configured": false, 00:20:28.499 "data_offset": 2048, 00:20:28.499 "data_size": 63488 00:20:28.499 } 00:20:28.499 ] 00:20:28.499 }' 00:20:28.499 12:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.499 12:01:42 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.435 12:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:29.435 12:01:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:29.694 [2024-07-15 12:01:43.091582] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:29.694 [2024-07-15 12:01:43.091635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:29.694 [2024-07-15 12:01:43.091655] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd5c660 00:20:29.694 [2024-07-15 12:01:43.091668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:29.694 [2024-07-15 12:01:43.092034] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:29.694 [2024-07-15 12:01:43.092055] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:29.694 [2024-07-15 12:01:43.092116] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:29.694 [2024-07-15 12:01:43.092135] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:29.694 pt2 00:20:29.694 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:29.694 [2024-07-15 12:01:43.280092] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:29.951 12:01:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:29.951 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:30.209 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:30.209 "name": "raid_bdev1", 00:20:30.209 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:30.209 "strip_size_kb": 64, 00:20:30.209 "state": "configuring", 00:20:30.209 "raid_level": "raid0", 00:20:30.209 "superblock": true, 00:20:30.209 "num_base_bdevs": 4, 00:20:30.209 "num_base_bdevs_discovered": 1, 00:20:30.209 "num_base_bdevs_operational": 4, 00:20:30.209 "base_bdevs_list": [ 00:20:30.209 { 00:20:30.209 "name": "pt1", 00:20:30.209 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:30.209 "is_configured": true, 00:20:30.209 "data_offset": 2048, 00:20:30.209 "data_size": 63488 00:20:30.209 }, 00:20:30.209 { 00:20:30.209 "name": null, 00:20:30.209 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:30.209 
"is_configured": false, 00:20:30.209 "data_offset": 2048, 00:20:30.209 "data_size": 63488 00:20:30.209 }, 00:20:30.209 { 00:20:30.209 "name": null, 00:20:30.209 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:30.209 "is_configured": false, 00:20:30.209 "data_offset": 2048, 00:20:30.209 "data_size": 63488 00:20:30.209 }, 00:20:30.209 { 00:20:30.209 "name": null, 00:20:30.209 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:30.209 "is_configured": false, 00:20:30.209 "data_offset": 2048, 00:20:30.209 "data_size": 63488 00:20:30.209 } 00:20:30.209 ] 00:20:30.209 }' 00:20:30.209 12:01:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:30.209 12:01:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:30.774 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:30.774 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:30.774 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:31.033 [2024-07-15 12:01:44.403054] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:31.033 [2024-07-15 12:01:44.403105] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:31.033 [2024-07-15 12:01:44.403124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd5c890 00:20:31.033 [2024-07-15 12:01:44.403136] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:31.033 [2024-07-15 12:01:44.403473] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:31.033 [2024-07-15 12:01:44.403493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:31.033 [2024-07-15 12:01:44.403558] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:31.033 [2024-07-15 12:01:44.403577] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:31.033 pt2 00:20:31.033 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:31.033 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:31.033 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:31.292 [2024-07-15 12:01:44.655731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:31.292 [2024-07-15 12:01:44.655777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:31.292 [2024-07-15 12:01:44.655796] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe08780 00:20:31.292 [2024-07-15 12:01:44.655808] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:31.292 [2024-07-15 12:01:44.656129] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:31.292 [2024-07-15 12:01:44.656152] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:31.292 [2024-07-15 12:01:44.656213] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:31.292 [2024-07-15 12:01:44.656231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:31.292 pt3 00:20:31.292 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:31.292 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:31.292 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:31.551 [2024-07-15 12:01:44.904393] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:31.551 [2024-07-15 12:01:44.904432] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:31.551 [2024-07-15 12:01:44.904448] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd67560 00:20:31.551 [2024-07-15 12:01:44.904460] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:31.551 [2024-07-15 12:01:44.904771] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:31.551 [2024-07-15 12:01:44.904789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:31.551 [2024-07-15 12:01:44.904845] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:31.551 [2024-07-15 12:01:44.904863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:31.551 [2024-07-15 12:01:44.904982] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd5bf60 00:20:31.551 [2024-07-15 12:01:44.904992] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:31.551 [2024-07-15 12:01:44.905162] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe06850 00:20:31.551 [2024-07-15 12:01:44.905287] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd5bf60 00:20:31.551 [2024-07-15 12:01:44.905296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xd5bf60 00:20:31.551 [2024-07-15 12:01:44.905390] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:31.551 pt4 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.551 12:01:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:31.551 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.551 "name": "raid_bdev1", 00:20:31.551 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:31.551 "strip_size_kb": 64, 00:20:31.551 "state": "online", 00:20:31.552 "raid_level": "raid0", 00:20:31.552 "superblock": true, 00:20:31.552 "num_base_bdevs": 4, 00:20:31.552 "num_base_bdevs_discovered": 4, 00:20:31.552 "num_base_bdevs_operational": 4, 00:20:31.552 "base_bdevs_list": [ 00:20:31.552 { 00:20:31.552 
"name": "pt1", 00:20:31.552 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:31.552 "is_configured": true, 00:20:31.552 "data_offset": 2048, 00:20:31.552 "data_size": 63488 00:20:31.552 }, 00:20:31.552 { 00:20:31.552 "name": "pt2", 00:20:31.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:31.552 "is_configured": true, 00:20:31.552 "data_offset": 2048, 00:20:31.552 "data_size": 63488 00:20:31.552 }, 00:20:31.552 { 00:20:31.552 "name": "pt3", 00:20:31.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:31.552 "is_configured": true, 00:20:31.552 "data_offset": 2048, 00:20:31.552 "data_size": 63488 00:20:31.552 }, 00:20:31.552 { 00:20:31.552 "name": "pt4", 00:20:31.552 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:31.552 "is_configured": true, 00:20:31.552 "data_offset": 2048, 00:20:31.552 "data_size": 63488 00:20:31.552 } 00:20:31.552 ] 00:20:31.552 }' 00:20:31.552 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.552 12:01:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:32.119 12:01:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:32.379 [2024-07-15 12:01:45.787056] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:32.379 "name": "raid_bdev1", 00:20:32.379 "aliases": [ 00:20:32.379 "7b4987e9-d684-41d4-a73c-082b014c4101" 00:20:32.379 ], 00:20:32.379 "product_name": "Raid Volume", 00:20:32.379 "block_size": 512, 00:20:32.379 "num_blocks": 253952, 00:20:32.379 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:32.379 "assigned_rate_limits": { 00:20:32.379 "rw_ios_per_sec": 0, 00:20:32.379 "rw_mbytes_per_sec": 0, 00:20:32.379 "r_mbytes_per_sec": 0, 00:20:32.379 "w_mbytes_per_sec": 0 00:20:32.379 }, 00:20:32.379 "claimed": false, 00:20:32.379 "zoned": false, 00:20:32.379 "supported_io_types": { 00:20:32.379 "read": true, 00:20:32.379 "write": true, 00:20:32.379 "unmap": true, 00:20:32.379 "flush": true, 00:20:32.379 "reset": true, 00:20:32.379 "nvme_admin": false, 00:20:32.379 "nvme_io": false, 00:20:32.379 "nvme_io_md": false, 00:20:32.379 "write_zeroes": true, 00:20:32.379 "zcopy": false, 00:20:32.379 "get_zone_info": false, 00:20:32.379 "zone_management": false, 00:20:32.379 "zone_append": false, 00:20:32.379 "compare": false, 00:20:32.379 "compare_and_write": false, 00:20:32.379 "abort": false, 00:20:32.379 "seek_hole": false, 00:20:32.379 "seek_data": false, 00:20:32.379 "copy": false, 00:20:32.379 "nvme_iov_md": false 00:20:32.379 }, 00:20:32.379 "memory_domains": [ 00:20:32.379 { 00:20:32.379 "dma_device_id": "system", 00:20:32.379 "dma_device_type": 1 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.379 "dma_device_type": 2 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "system", 00:20:32.379 "dma_device_type": 1 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.379 "dma_device_type": 2 00:20:32.379 }, 
00:20:32.379 { 00:20:32.379 "dma_device_id": "system", 00:20:32.379 "dma_device_type": 1 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.379 "dma_device_type": 2 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "system", 00:20:32.379 "dma_device_type": 1 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.379 "dma_device_type": 2 00:20:32.379 } 00:20:32.379 ], 00:20:32.379 "driver_specific": { 00:20:32.379 "raid": { 00:20:32.379 "uuid": "7b4987e9-d684-41d4-a73c-082b014c4101", 00:20:32.379 "strip_size_kb": 64, 00:20:32.379 "state": "online", 00:20:32.379 "raid_level": "raid0", 00:20:32.379 "superblock": true, 00:20:32.379 "num_base_bdevs": 4, 00:20:32.379 "num_base_bdevs_discovered": 4, 00:20:32.379 "num_base_bdevs_operational": 4, 00:20:32.379 "base_bdevs_list": [ 00:20:32.379 { 00:20:32.379 "name": "pt1", 00:20:32.379 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:32.379 "is_configured": true, 00:20:32.379 "data_offset": 2048, 00:20:32.379 "data_size": 63488 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "name": "pt2", 00:20:32.379 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:32.379 "is_configured": true, 00:20:32.379 "data_offset": 2048, 00:20:32.379 "data_size": 63488 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "name": "pt3", 00:20:32.379 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:32.379 "is_configured": true, 00:20:32.379 "data_offset": 2048, 00:20:32.379 "data_size": 63488 00:20:32.379 }, 00:20:32.379 { 00:20:32.379 "name": "pt4", 00:20:32.379 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:32.379 "is_configured": true, 00:20:32.379 "data_offset": 2048, 00:20:32.379 "data_size": 63488 00:20:32.379 } 00:20:32.379 ] 00:20:32.379 } 00:20:32.379 } 00:20:32.379 }' 00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 
00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:32.379 pt2 00:20:32.379 pt3 00:20:32.379 pt4' 00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:32.379 12:01:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:32.639 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:32.639 "name": "pt1", 00:20:32.639 "aliases": [ 00:20:32.639 "00000000-0000-0000-0000-000000000001" 00:20:32.639 ], 00:20:32.639 "product_name": "passthru", 00:20:32.639 "block_size": 512, 00:20:32.639 "num_blocks": 65536, 00:20:32.639 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:32.639 "assigned_rate_limits": { 00:20:32.639 "rw_ios_per_sec": 0, 00:20:32.639 "rw_mbytes_per_sec": 0, 00:20:32.639 "r_mbytes_per_sec": 0, 00:20:32.639 "w_mbytes_per_sec": 0 00:20:32.639 }, 00:20:32.639 "claimed": true, 00:20:32.639 "claim_type": "exclusive_write", 00:20:32.639 "zoned": false, 00:20:32.639 "supported_io_types": { 00:20:32.639 "read": true, 00:20:32.639 "write": true, 00:20:32.639 "unmap": true, 00:20:32.639 "flush": true, 00:20:32.639 "reset": true, 00:20:32.639 "nvme_admin": false, 00:20:32.639 "nvme_io": false, 00:20:32.639 "nvme_io_md": false, 00:20:32.639 "write_zeroes": true, 00:20:32.639 "zcopy": true, 00:20:32.639 "get_zone_info": false, 00:20:32.639 "zone_management": false, 00:20:32.639 "zone_append": false, 00:20:32.639 "compare": false, 00:20:32.639 "compare_and_write": false, 00:20:32.639 "abort": true, 00:20:32.639 "seek_hole": false, 00:20:32.639 "seek_data": false, 00:20:32.639 "copy": true, 00:20:32.639 "nvme_iov_md": false 00:20:32.639 }, 00:20:32.639 "memory_domains": [ 00:20:32.639 { 
00:20:32.639 "dma_device_id": "system", 00:20:32.639 "dma_device_type": 1 00:20:32.639 }, 00:20:32.639 { 00:20:32.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:32.639 "dma_device_type": 2 00:20:32.639 } 00:20:32.639 ], 00:20:32.639 "driver_specific": { 00:20:32.639 "passthru": { 00:20:32.639 "name": "pt1", 00:20:32.639 "base_bdev_name": "malloc1" 00:20:32.639 } 00:20:32.639 } 00:20:32.639 }' 00:20:32.639 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.639 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:32.897 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.156 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.156 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:33.156 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:33.156 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:33.415 
12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:33.415 "name": "pt2", 00:20:33.415 "aliases": [ 00:20:33.415 "00000000-0000-0000-0000-000000000002" 00:20:33.415 ], 00:20:33.415 "product_name": "passthru", 00:20:33.415 "block_size": 512, 00:20:33.415 "num_blocks": 65536, 00:20:33.415 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:33.415 "assigned_rate_limits": { 00:20:33.415 "rw_ios_per_sec": 0, 00:20:33.415 "rw_mbytes_per_sec": 0, 00:20:33.415 "r_mbytes_per_sec": 0, 00:20:33.415 "w_mbytes_per_sec": 0 00:20:33.415 }, 00:20:33.415 "claimed": true, 00:20:33.415 "claim_type": "exclusive_write", 00:20:33.415 "zoned": false, 00:20:33.415 "supported_io_types": { 00:20:33.415 "read": true, 00:20:33.415 "write": true, 00:20:33.415 "unmap": true, 00:20:33.415 "flush": true, 00:20:33.415 "reset": true, 00:20:33.415 "nvme_admin": false, 00:20:33.415 "nvme_io": false, 00:20:33.415 "nvme_io_md": false, 00:20:33.415 "write_zeroes": true, 00:20:33.415 "zcopy": true, 00:20:33.415 "get_zone_info": false, 00:20:33.415 "zone_management": false, 00:20:33.415 "zone_append": false, 00:20:33.415 "compare": false, 00:20:33.415 "compare_and_write": false, 00:20:33.415 "abort": true, 00:20:33.415 "seek_hole": false, 00:20:33.416 "seek_data": false, 00:20:33.416 "copy": true, 00:20:33.416 "nvme_iov_md": false 00:20:33.416 }, 00:20:33.416 "memory_domains": [ 00:20:33.416 { 00:20:33.416 "dma_device_id": "system", 00:20:33.416 "dma_device_type": 1 00:20:33.416 }, 00:20:33.416 { 00:20:33.416 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.416 "dma_device_type": 2 00:20:33.416 } 00:20:33.416 ], 00:20:33.416 "driver_specific": { 00:20:33.416 "passthru": { 00:20:33.416 "name": "pt2", 00:20:33.416 "base_bdev_name": "malloc2" 00:20:33.416 } 00:20:33.416 } 00:20:33.416 }' 00:20:33.416 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:33.416 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:20:33.416 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:33.416 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.416 12:01:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.674 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:33.932 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:33.932 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:33.932 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:33.932 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:34.191 "name": "pt3", 00:20:34.191 "aliases": [ 00:20:34.191 "00000000-0000-0000-0000-000000000003" 00:20:34.191 ], 00:20:34.191 "product_name": "passthru", 00:20:34.191 "block_size": 512, 00:20:34.191 "num_blocks": 65536, 00:20:34.191 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:34.191 "assigned_rate_limits": { 00:20:34.191 "rw_ios_per_sec": 0, 00:20:34.191 "rw_mbytes_per_sec": 0, 00:20:34.191 "r_mbytes_per_sec": 0, 00:20:34.191 "w_mbytes_per_sec": 0 00:20:34.191 }, 
00:20:34.191 "claimed": true, 00:20:34.191 "claim_type": "exclusive_write", 00:20:34.191 "zoned": false, 00:20:34.191 "supported_io_types": { 00:20:34.191 "read": true, 00:20:34.191 "write": true, 00:20:34.191 "unmap": true, 00:20:34.191 "flush": true, 00:20:34.191 "reset": true, 00:20:34.191 "nvme_admin": false, 00:20:34.191 "nvme_io": false, 00:20:34.191 "nvme_io_md": false, 00:20:34.191 "write_zeroes": true, 00:20:34.191 "zcopy": true, 00:20:34.191 "get_zone_info": false, 00:20:34.191 "zone_management": false, 00:20:34.191 "zone_append": false, 00:20:34.191 "compare": false, 00:20:34.191 "compare_and_write": false, 00:20:34.191 "abort": true, 00:20:34.191 "seek_hole": false, 00:20:34.191 "seek_data": false, 00:20:34.191 "copy": true, 00:20:34.191 "nvme_iov_md": false 00:20:34.191 }, 00:20:34.191 "memory_domains": [ 00:20:34.191 { 00:20:34.191 "dma_device_id": "system", 00:20:34.191 "dma_device_type": 1 00:20:34.191 }, 00:20:34.191 { 00:20:34.191 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.191 "dma_device_type": 2 00:20:34.191 } 00:20:34.191 ], 00:20:34.191 "driver_specific": { 00:20:34.191 "passthru": { 00:20:34.191 "name": "pt3", 00:20:34.191 "base_bdev_name": "malloc3" 00:20:34.191 } 00:20:34.191 } 00:20:34.191 }' 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.191 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.451 12:01:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:34.451 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:34.451 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:34.451 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:34.451 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:35.019 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:35.019 "name": "pt4", 00:20:35.019 "aliases": [ 00:20:35.019 "00000000-0000-0000-0000-000000000004" 00:20:35.019 ], 00:20:35.019 "product_name": "passthru", 00:20:35.019 "block_size": 512, 00:20:35.019 "num_blocks": 65536, 00:20:35.019 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:35.020 "assigned_rate_limits": { 00:20:35.020 "rw_ios_per_sec": 0, 00:20:35.020 "rw_mbytes_per_sec": 0, 00:20:35.020 "r_mbytes_per_sec": 0, 00:20:35.020 "w_mbytes_per_sec": 0 00:20:35.020 }, 00:20:35.020 "claimed": true, 00:20:35.020 "claim_type": "exclusive_write", 00:20:35.020 "zoned": false, 00:20:35.020 "supported_io_types": { 00:20:35.020 "read": true, 00:20:35.020 "write": true, 00:20:35.020 "unmap": true, 00:20:35.020 "flush": true, 00:20:35.020 "reset": true, 00:20:35.020 "nvme_admin": false, 00:20:35.020 "nvme_io": false, 00:20:35.020 "nvme_io_md": false, 00:20:35.020 "write_zeroes": true, 00:20:35.020 "zcopy": true, 00:20:35.020 "get_zone_info": false, 00:20:35.020 "zone_management": false, 00:20:35.020 "zone_append": false, 00:20:35.020 
"compare": false, 00:20:35.020 "compare_and_write": false, 00:20:35.020 "abort": true, 00:20:35.020 "seek_hole": false, 00:20:35.020 "seek_data": false, 00:20:35.020 "copy": true, 00:20:35.020 "nvme_iov_md": false 00:20:35.020 }, 00:20:35.020 "memory_domains": [ 00:20:35.020 { 00:20:35.020 "dma_device_id": "system", 00:20:35.020 "dma_device_type": 1 00:20:35.020 }, 00:20:35.020 { 00:20:35.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.020 "dma_device_type": 2 00:20:35.020 } 00:20:35.020 ], 00:20:35.020 "driver_specific": { 00:20:35.020 "passthru": { 00:20:35.020 "name": "pt4", 00:20:35.020 "base_bdev_name": "malloc4" 00:20:35.020 } 00:20:35.020 } 00:20:35.020 }' 00:20:35.020 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.020 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:35.278 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.536 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:35.536 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:35.536 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:35.536 12:01:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:20:35.793 [2024-07-15 12:01:49.151989] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 7b4987e9-d684-41d4-a73c-082b014c4101 '!=' 7b4987e9-d684-41d4-a73c-082b014c4101 ']' 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1527299 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1527299 ']' 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1527299 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1527299 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1527299' 00:20:35.793 killing process with pid 1527299 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1527299 00:20:35.793 [2024-07-15 
12:01:49.229013] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:35.793 [2024-07-15 12:01:49.229074] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:35.793 [2024-07-15 12:01:49.229137] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:35.793 [2024-07-15 12:01:49.229149] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd5bf60 name raid_bdev1, state offline 00:20:35.793 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1527299 00:20:35.793 [2024-07-15 12:01:49.266010] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:36.053 12:01:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:20:36.053 00:20:36.053 real 0m17.140s 00:20:36.053 user 0m31.041s 00:20:36.053 sys 0m3.121s 00:20:36.053 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:36.053 12:01:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.053 ************************************ 00:20:36.053 END TEST raid_superblock_test 00:20:36.053 ************************************ 00:20:36.053 12:01:49 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:20:36.053 12:01:49 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:20:36.053 12:01:49 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:20:36.053 12:01:49 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:20:36.053 12:01:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:36.053 ************************************ 00:20:36.053 START TEST raid_read_error_test 00:20:36.053 ************************************ 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 read 00:20:36.053 12:01:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.fY29U5DqxY 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1529787 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1529787 /var/tmp/spdk-raid.sock 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1529787 ']' 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket 
/var/tmp/spdk-raid.sock...' 00:20:36.053 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:20:36.053 12:01:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.053 [2024-07-15 12:01:49.631619] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:20:36.053 [2024-07-15 12:01:49.631693] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1529787 ] 00:20:36.312 [2024-07-15 12:01:49.759585] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:36.312 [2024-07-15 12:01:49.865657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:36.572 [2024-07-15 12:01:49.936577] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:36.572 [2024-07-15 12:01:49.936616] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:37.141 12:01:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:20:37.141 12:01:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:20:37.141 12:01:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.141 12:01:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:37.400 BaseBdev1_malloc 00:20:37.400 12:01:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:37.657 true 00:20:37.658 
12:01:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:37.916 [2024-07-15 12:01:51.272157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:37.916 [2024-07-15 12:01:51.272203] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:37.916 [2024-07-15 12:01:51.272222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23874e0 00:20:37.916 [2024-07-15 12:01:51.272235] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:37.916 [2024-07-15 12:01:51.274033] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:37.916 [2024-07-15 12:01:51.274063] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:37.916 BaseBdev1 00:20:37.916 12:01:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:37.916 12:01:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:38.175 BaseBdev2_malloc 00:20:38.175 12:01:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:38.175 true 00:20:38.434 12:01:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:38.434 [2024-07-15 12:01:52.007440] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:38.434 [2024-07-15 12:01:52.007484] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:38.434 [2024-07-15 12:01:52.007504] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238c7b0 00:20:38.434 [2024-07-15 12:01:52.007517] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:38.434 [2024-07-15 12:01:52.009164] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:38.434 [2024-07-15 12:01:52.009195] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:38.434 BaseBdev2 00:20:38.434 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:38.434 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:38.745 BaseBdev3_malloc 00:20:38.745 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:39.035 true 00:20:39.035 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:39.294 [2024-07-15 12:01:52.731096] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:39.294 [2024-07-15 12:01:52.731142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:39.294 [2024-07-15 12:01:52.731163] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x238e8f0 00:20:39.294 [2024-07-15 12:01:52.731176] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:39.294 [2024-07-15 12:01:52.732793] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:20:39.294 [2024-07-15 12:01:52.732821] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:39.294 BaseBdev3 00:20:39.294 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:20:39.294 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:39.554 BaseBdev4_malloc 00:20:39.554 12:01:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:39.814 true 00:20:39.814 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:40.073 [2024-07-15 12:01:53.453536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:40.073 [2024-07-15 12:01:53.453583] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:40.073 [2024-07-15 12:01:53.453602] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2390dc0 00:20:40.073 [2024-07-15 12:01:53.453615] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:40.073 [2024-07-15 12:01:53.455258] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:40.073 [2024-07-15 12:01:53.455287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:40.073 BaseBdev4 00:20:40.073 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:20:40.332 [2024-07-15 12:01:53.686186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:40.332 [2024-07-15 12:01:53.687508] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:40.332 [2024-07-15 12:01:53.687578] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:40.332 [2024-07-15 12:01:53.687636] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:40.332 [2024-07-15 12:01:53.687877] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x238f090 00:20:40.332 [2024-07-15 12:01:53.687889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:40.332 [2024-07-15 12:01:53.688088] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2391bf0 00:20:40.332 [2024-07-15 12:01:53.688241] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x238f090 00:20:40.332 [2024-07-15 12:01:53.688250] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x238f090 00:20:40.332 [2024-07-15 12:01:53.688353] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.332 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:40.592 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.592 "name": "raid_bdev1", 00:20:40.592 "uuid": "b7b20979-7823-4474-b14a-d0508702d36b", 00:20:40.592 "strip_size_kb": 64, 00:20:40.592 "state": "online", 00:20:40.592 "raid_level": "raid0", 00:20:40.592 "superblock": true, 00:20:40.592 "num_base_bdevs": 4, 00:20:40.592 "num_base_bdevs_discovered": 4, 00:20:40.592 "num_base_bdevs_operational": 4, 00:20:40.592 "base_bdevs_list": [ 00:20:40.592 { 00:20:40.592 "name": "BaseBdev1", 00:20:40.592 "uuid": "d07853c4-b82e-57e9-a2d1-f43d7c54bd1f", 00:20:40.592 "is_configured": true, 00:20:40.592 "data_offset": 2048, 00:20:40.592 "data_size": 63488 00:20:40.592 }, 00:20:40.592 { 00:20:40.592 "name": "BaseBdev2", 00:20:40.592 "uuid": "069cef44-c8a5-572d-a2ed-35feb452b97a", 00:20:40.592 "is_configured": true, 00:20:40.592 "data_offset": 2048, 00:20:40.592 "data_size": 63488 00:20:40.592 }, 00:20:40.592 { 00:20:40.592 "name": "BaseBdev3", 00:20:40.592 "uuid": "79935369-e71e-5457-9a8b-88f5ecc85eb8", 00:20:40.592 "is_configured": true, 00:20:40.592 "data_offset": 2048, 00:20:40.592 "data_size": 63488 00:20:40.592 }, 00:20:40.592 { 00:20:40.592 "name": "BaseBdev4", 00:20:40.592 "uuid": "174a3f4e-2712-50a4-bcab-d5900bae693e", 00:20:40.592 
"is_configured": true, 00:20:40.592 "data_offset": 2048, 00:20:40.592 "data_size": 63488 00:20:40.592 } 00:20:40.592 ] 00:20:40.592 }' 00:20:40.592 12:01:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.592 12:01:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.161 12:01:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:20:41.161 12:01:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:41.420 [2024-07-15 12:01:54.953826] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2391a40 00:20:42.358 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.617 12:01:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:43.186 12:01:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:43.186 "name": "raid_bdev1", 00:20:43.186 "uuid": "b7b20979-7823-4474-b14a-d0508702d36b", 00:20:43.186 "strip_size_kb": 64, 00:20:43.186 "state": "online", 00:20:43.186 "raid_level": "raid0", 00:20:43.186 "superblock": true, 00:20:43.186 "num_base_bdevs": 4, 00:20:43.186 "num_base_bdevs_discovered": 4, 00:20:43.186 "num_base_bdevs_operational": 4, 00:20:43.186 "base_bdevs_list": [ 00:20:43.186 { 00:20:43.186 "name": "BaseBdev1", 00:20:43.186 "uuid": "d07853c4-b82e-57e9-a2d1-f43d7c54bd1f", 00:20:43.186 "is_configured": true, 00:20:43.186 "data_offset": 2048, 00:20:43.186 "data_size": 63488 00:20:43.186 }, 00:20:43.186 { 00:20:43.186 "name": "BaseBdev2", 00:20:43.186 "uuid": "069cef44-c8a5-572d-a2ed-35feb452b97a", 00:20:43.186 "is_configured": true, 00:20:43.186 "data_offset": 2048, 00:20:43.186 "data_size": 63488 00:20:43.186 }, 00:20:43.186 { 00:20:43.186 "name": "BaseBdev3", 00:20:43.186 "uuid": "79935369-e71e-5457-9a8b-88f5ecc85eb8", 00:20:43.186 "is_configured": true, 00:20:43.186 "data_offset": 2048, 00:20:43.186 "data_size": 63488 00:20:43.186 }, 00:20:43.186 { 00:20:43.186 "name": "BaseBdev4", 00:20:43.186 "uuid": 
"174a3f4e-2712-50a4-bcab-d5900bae693e", 00:20:43.186 "is_configured": true, 00:20:43.186 "data_offset": 2048, 00:20:43.186 "data_size": 63488 00:20:43.186 } 00:20:43.186 ] 00:20:43.186 }' 00:20:43.186 12:01:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:43.186 12:01:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:43.755 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:44.014 [2024-07-15 12:01:57.519377] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:44.014 [2024-07-15 12:01:57.519413] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:44.014 [2024-07-15 12:01:57.522605] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:44.014 [2024-07-15 12:01:57.522644] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:44.014 [2024-07-15 12:01:57.522681] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:44.014 [2024-07-15 12:01:57.522700] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x238f090 name raid_bdev1, state offline 00:20:44.014 0 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1529787 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1529787 ']' 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1529787 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # 
ps --no-headers -o comm= 1529787 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1529787' 00:20:44.014 killing process with pid 1529787 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1529787 00:20:44.014 [2024-07-15 12:01:57.591630] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:44.014 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1529787 00:20:44.273 [2024-07-15 12:01:57.627249] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:44.273 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.fY29U5DqxY 00:20:44.273 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:20:44.273 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.39 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.39 != \0\.\0\0 ]] 00:20:44.533 00:20:44.533 real 0m8.321s 00:20:44.533 user 0m13.605s 00:20:44.533 sys 0m1.408s 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:20:44.533 12:01:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:44.533 
************************************
00:20:44.533 END TEST raid_read_error_test
00:20:44.533 ************************************
00:20:44.533 12:01:57 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:20:44.533 12:01:57 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write
00:20:44.533 12:01:57 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:20:44.533 12:01:57 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:20:44.533 12:01:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:20:44.533 ************************************
00:20:44.533 START TEST raid_write_error_test
00:20:44.533 ************************************
00:20:44.533 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid0 4 write
00:20:44.533 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0
00:20:44.533 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']'
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64'
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RcPsQJncKE
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1530946
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1530946 /var/tmp/spdk-raid.sock
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1530946 ']'
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:20:44.534 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:20:44.534 12:01:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:20:44.534 [2024-07-15 12:01:58.036648] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:20:44.534 [2024-07-15 12:01:58.036729] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1530946 ]
00:20:44.793 [2024-07-15 12:01:58.167581] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:20:44.793 [2024-07-15 12:01:58.273835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:20:44.793 [2024-07-15 12:01:58.341790] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:20:44.793 [2024-07-15 12:01:58.341833] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:20:45.732 12:01:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:20:45.732 12:01:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:20:45.732 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:20:45.732 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:20:45.732 BaseBdev1_malloc
00:20:45.732 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:20:45.993 true
00:20:45.993 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:20:46.253 [2024-07-15 12:01:59.783984] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:20:46.253 [2024-07-15 12:01:59.784028] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:20:46.253 [2024-07-15 12:01:59.784049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f14e0
00:20:46.253 [2024-07-15 12:01:59.784062] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:20:46.253 [2024-07-15 12:01:59.785893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:20:46.253 [2024-07-15 12:01:59.785926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:20:46.253 BaseBdev1
00:20:46.253 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:20:46.253 12:01:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:20:46.512 BaseBdev2_malloc
00:20:46.512 12:02:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:20:46.772 true
00:20:46.772 12:02:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:20:47.030 [2024-07-15 12:02:00.571996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:20:47.030 [2024-07-15 12:02:00.572042] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:20:47.030 [2024-07-15 12:02:00.572062] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f67b0
00:20:47.030 [2024-07-15 12:02:00.572075] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:20:47.030 [2024-07-15 12:02:00.573638] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:20:47.030 [2024-07-15 12:02:00.573666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:20:47.030 BaseBdev2
00:20:47.030 12:02:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:20:47.030 12:02:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:20:47.290 BaseBdev3_malloc
00:20:47.549 12:02:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:20:47.549 true
00:20:47.549 12:02:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:20:47.808 [2024-07-15 12:02:01.398743] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:20:47.808 [2024-07-15 12:02:01.398790] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:20:47.808 [2024-07-15 12:02:01.398812] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f88f0
00:20:47.808 [2024-07-15 12:02:01.398823] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:20:47.808 [2024-07-15 12:02:01.400403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:20:47.808 [2024-07-15 12:02:01.400433] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:20:47.808 BaseBdev3
00:20:48.067 12:02:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:20:48.067 12:02:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:20:48.067 BaseBdev4_malloc
00:20:48.326 12:02:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:20:48.585 true
00:20:48.585 12:02:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:20:48.845 [2024-07-15 12:02:02.186621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:20:48.845 [2024-07-15 12:02:02.186665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:20:48.845 [2024-07-15 12:02:02.186692] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28fadc0
00:20:48.845 [2024-07-15 12:02:02.186705] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:20:48.845 [2024-07-15 12:02:02.188277] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:20:48.845 [2024-07-15 12:02:02.188305] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:20:48.845 BaseBdev4
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:20:48.845 [2024-07-15 12:02:02.419273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:20:48.845 [2024-07-15 12:02:02.420564] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:20:48.845 [2024-07-15 12:02:02.420633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:20:48.845 [2024-07-15 12:02:02.420702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:20:48.845 [2024-07-15 12:02:02.420935] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28f9090
00:20:48.845 [2024-07-15 12:02:02.420946] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:20:48.845 [2024-07-15 12:02:02.421144] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fbbf0
00:20:48.845 [2024-07-15 12:02:02.421291] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28f9090
00:20:48.845 [2024-07-15 12:02:02.421302] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28f9090
00:20:48.845 [2024-07-15 12:02:02.421405] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:20:48.845 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:20:49.103 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:20:49.363 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:20:49.363 "name": "raid_bdev1",
00:20:49.363 "uuid": "89dddf3f-81a2-4249-9eee-f09ac587998b",
00:20:49.363 "strip_size_kb": 64,
00:20:49.363 "state": "online",
00:20:49.363 "raid_level": "raid0",
00:20:49.363 "superblock": true,
00:20:49.363 "num_base_bdevs": 4,
00:20:49.363 "num_base_bdevs_discovered": 4,
00:20:49.363 "num_base_bdevs_operational": 4,
00:20:49.363 "base_bdevs_list": [
00:20:49.363 {
00:20:49.363 "name": "BaseBdev1",
00:20:49.363 "uuid": "4423c29f-027d-5017-9e65-226a04b27d1f",
00:20:49.363 "is_configured": true,
00:20:49.363 "data_offset": 2048,
00:20:49.363 "data_size": 63488
00:20:49.363 },
00:20:49.363 {
00:20:49.363 "name": "BaseBdev2",
00:20:49.363 "uuid": "a621c7e9-1443-5e97-9b1a-7389a3871cfb",
00:20:49.363 "is_configured": true,
00:20:49.363 "data_offset": 2048,
00:20:49.363 "data_size": 63488
00:20:49.363 },
00:20:49.363 {
00:20:49.363 "name": "BaseBdev3",
00:20:49.363 "uuid": "7c03f6f6-c6b6-5f2a-9b44-3c00154cbbb2",
00:20:49.363 "is_configured": true,
00:20:49.363 "data_offset": 2048,
00:20:49.363 "data_size": 63488
00:20:49.363 },
00:20:49.363 {
00:20:49.363 "name": "BaseBdev4",
00:20:49.363 "uuid": "656a38fa-f444-5d58-b8ce-3b3f88613cc9",
00:20:49.363 "is_configured": true,
00:20:49.363 "data_offset": 2048,
00:20:49.363 "data_size": 63488
00:20:49.363 }
00:20:49.363 ]
00:20:49.363 }'
00:20:49.363 12:02:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:20:49.363 12:02:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:20:50.298 12:02:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:20:50.298 12:02:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:20:50.298 [2024-07-15 12:02:03.674859] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fba40
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]]
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:20:51.236 12:02:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:20:51.495 12:02:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:20:51.495 "name": "raid_bdev1",
00:20:51.495 "uuid": "89dddf3f-81a2-4249-9eee-f09ac587998b",
00:20:51.495 "strip_size_kb": 64,
00:20:51.495 "state": "online",
00:20:51.495 "raid_level": "raid0",
00:20:51.495 "superblock": true,
00:20:51.495 "num_base_bdevs": 4,
00:20:51.495 "num_base_bdevs_discovered": 4,
00:20:51.495 "num_base_bdevs_operational": 4,
00:20:51.495 "base_bdevs_list": [
00:20:51.495 {
00:20:51.495 "name": "BaseBdev1",
00:20:51.495 "uuid": "4423c29f-027d-5017-9e65-226a04b27d1f",
00:20:51.495 "is_configured": true,
00:20:51.495 "data_offset": 2048,
00:20:51.495 "data_size": 63488
00:20:51.495 },
00:20:51.495 {
00:20:51.495 "name": "BaseBdev2",
00:20:51.495 "uuid": "a621c7e9-1443-5e97-9b1a-7389a3871cfb",
00:20:51.495 "is_configured": true,
00:20:51.495 "data_offset": 2048,
00:20:51.495 "data_size": 63488
00:20:51.495 },
00:20:51.495 {
00:20:51.495 "name": "BaseBdev3",
00:20:51.495 "uuid": "7c03f6f6-c6b6-5f2a-9b44-3c00154cbbb2",
00:20:51.495 "is_configured": true,
00:20:51.495 "data_offset": 2048,
00:20:51.495 "data_size": 63488
00:20:51.495 },
00:20:51.495 {
00:20:51.495 "name": "BaseBdev4",
00:20:51.495 "uuid": "656a38fa-f444-5d58-b8ce-3b3f88613cc9",
00:20:51.495 "is_configured": true,
00:20:51.495 "data_offset": 2048,
00:20:51.495 "data_size": 63488
00:20:51.495 }
00:20:51.495 ]
00:20:51.495 }'
00:20:51.495 12:02:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:20:51.495 12:02:05 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:20:52.436 12:02:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:20:52.696 [2024-07-15 12:02:06.062188] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:20:52.696 [2024-07-15 12:02:06.062224] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:20:52.696 [2024-07-15 12:02:06.065552] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:20:52.697 [2024-07-15 12:02:06.065589] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:20:52.697 [2024-07-15 12:02:06.065627] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:20:52.697 [2024-07-15 12:02:06.065638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28f9090 name raid_bdev1, state offline
00:20:52.697 0
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1530946
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1530946 ']'
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1530946
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1530946
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1530946'
00:20:52.697 killing process with pid 1530946
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1530946
00:20:52.697 [2024-07-15 12:02:06.147087] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:20:52.697 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1530946
00:20:52.697 [2024-07-15 12:02:06.178872] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RcPsQJncKE
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.42
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.42 != \0\.\0\0 ]]
00:20:52.956
00:20:52.956 real 0m8.453s
00:20:52.956 user 0m13.823s
00:20:52.956 sys 0m1.416s
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:20:52.956 12:02:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:20:52.956 ************************************
00:20:52.956 END TEST raid_write_error_test
00:20:52.956 ************************************
00:20:52.956 12:02:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:20:52.956 12:02:06 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1
00:20:52.956 12:02:06 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false
00:20:52.956 12:02:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:20:52.956 12:02:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:20:52.956 12:02:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:20:52.956 ************************************
00:20:52.956 START TEST raid_state_function_test
00:20:52.956 ************************************
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 false
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:20:52.956 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']'
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1532101
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1532101'
00:20:52.957 Process raid pid: 1532101
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1532101 /var/tmp/spdk-raid.sock
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1532101 ']'
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:20:52.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:20:52.957 12:02:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:20:53.216 [2024-07-15 12:02:06.570522] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:20:53.216 [2024-07-15 12:02:06.570578] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:20:53.216 [2024-07-15 12:02:06.683360] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:20:53.216 [2024-07-15 12:02:06.790850] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:20:53.475 [2024-07-15 12:02:06.852772] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:20:53.475 [2024-07-15 12:02:06.852809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:20:54.044 12:02:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:20:54.044 12:02:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0
00:20:54.044 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:20:54.044 [2024-07-15 12:02:07.598656] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:20:54.044 [2024-07-15 12:02:07.598708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:20:54.044 [2024-07-15 12:02:07.598719] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:20:54.044 [2024-07-15 12:02:07.598732] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:20:54.044 [2024-07-15 12:02:07.598740] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:20:54.044 [2024-07-15 12:02:07.598752] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:20:54.045 [2024-07-15 12:02:07.598763] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:20:54.045 [2024-07-15 12:02:07.598775] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:20:54.045 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:20:54.304 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:20:54.304 "name": "Existed_Raid",
00:20:54.304 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:54.304 "strip_size_kb": 64,
00:20:54.304 "state": "configuring",
00:20:54.304 "raid_level": "concat",
00:20:54.304 "superblock": false,
00:20:54.304 "num_base_bdevs": 4,
00:20:54.304 "num_base_bdevs_discovered": 0,
00:20:54.304 "num_base_bdevs_operational": 4,
00:20:54.304 "base_bdevs_list": [
00:20:54.304 {
00:20:54.304 "name": "BaseBdev1",
00:20:54.304 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:54.304 "is_configured": false,
00:20:54.304 "data_offset": 0,
00:20:54.304 "data_size": 0
00:20:54.304 },
00:20:54.304 {
00:20:54.304 "name": "BaseBdev2",
00:20:54.304 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:54.304 "is_configured": false,
00:20:54.304 "data_offset": 0,
00:20:54.304 "data_size": 0
00:20:54.304 },
00:20:54.304 {
00:20:54.304 "name": "BaseBdev3",
00:20:54.304 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:54.304 "is_configured": false,
00:20:54.304 "data_offset": 0,
00:20:54.304 "data_size": 0
00:20:54.304 },
00:20:54.304 {
00:20:54.304 "name": "BaseBdev4",
00:20:54.304 "uuid": "00000000-0000-0000-0000-000000000000",
00:20:54.304 "is_configured": false,
00:20:54.304 "data_offset": 0,
00:20:54.304 "data_size": 0
00:20:54.304 }
00:20:54.304 ]
00:20:54.304 }'
00:20:54.304 12:02:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:20:54.304 12:02:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:20:54.872 12:02:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:20:55.131 [2024-07-15 12:02:08.625242] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:20:55.131 [2024-07-15 12:02:08.625276] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e9b20 name Existed_Raid, state configuring
00:20:55.132 12:02:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
00:20:55.391 [2024-07-15 12:02:08.869903] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:20:55.391 [2024-07-15 12:02:08.869936] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:20:55.391 [2024-07-15 12:02:08.869945] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:20:55.391 [2024-07-15 12:02:08.869956] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:20:55.391 [2024-07-15 12:02:08.869965] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:20:55.391 [2024-07-15 12:02:08.869976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:20:55.391 [2024-07-15 12:02:08.869985] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4
00:20:55.391 [2024-07-15 12:02:08.869996] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now
00:20:55.391 12:02:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:20:55.650 [2024-07-15 12:02:09.125640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:20:55.650 BaseBdev1
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout=
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]]
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000
00:20:55.650 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:20:55.909 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:20:56.169 [
00:20:56.169 {
00:20:56.169 "name": "BaseBdev1",
00:20:56.169 "aliases": [
00:20:56.169 "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7"
00:20:56.169 ],
00:20:56.169 "product_name": "Malloc disk",
00:20:56.169 "block_size": 512,
00:20:56.169 "num_blocks": 65536,
00:20:56.169 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7",
00:20:56.169 "assigned_rate_limits": {
00:20:56.169 "rw_ios_per_sec": 0,
00:20:56.169 "rw_mbytes_per_sec": 0,
00:20:56.169 "r_mbytes_per_sec": 0,
00:20:56.169 "w_mbytes_per_sec": 0
00:20:56.169 },
00:20:56.169 "claimed": true,
00:20:56.169 "claim_type": "exclusive_write",
00:20:56.169 "zoned": false,
00:20:56.169 "supported_io_types": {
00:20:56.169 "read": true,
00:20:56.169 "write": true,
00:20:56.169 "unmap": true,
00:20:56.169 "flush": true,
00:20:56.169 "reset": true,
00:20:56.169 "nvme_admin": false,
00:20:56.169 "nvme_io": false,
00:20:56.169 "nvme_io_md": false,
00:20:56.169 "write_zeroes": true,
00:20:56.169 "zcopy": true,
00:20:56.169 "get_zone_info": false,
00:20:56.169 "zone_management": false,
00:20:56.169 "zone_append": false,
00:20:56.169 "compare": false,
00:20:56.169 "compare_and_write": false,
00:20:56.169 "abort": true,
00:20:56.169 "seek_hole": false,
00:20:56.169 "seek_data": false,
00:20:56.169 "copy": true,
00:20:56.169 "nvme_iov_md":
false 00:20:56.169 }, 00:20:56.169 "memory_domains": [ 00:20:56.169 { 00:20:56.169 "dma_device_id": "system", 00:20:56.169 "dma_device_type": 1 00:20:56.169 }, 00:20:56.169 { 00:20:56.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:56.169 "dma_device_type": 2 00:20:56.169 } 00:20:56.169 ], 00:20:56.169 "driver_specific": {} 00:20:56.169 } 00:20:56.169 ] 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:56.169 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:56.428 12:02:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:56.428 "name": "Existed_Raid", 00:20:56.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.428 "strip_size_kb": 64, 00:20:56.428 "state": "configuring", 00:20:56.428 "raid_level": "concat", 00:20:56.428 "superblock": false, 00:20:56.428 "num_base_bdevs": 4, 00:20:56.428 "num_base_bdevs_discovered": 1, 00:20:56.428 "num_base_bdevs_operational": 4, 00:20:56.428 "base_bdevs_list": [ 00:20:56.428 { 00:20:56.428 "name": "BaseBdev1", 00:20:56.428 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:20:56.428 "is_configured": true, 00:20:56.428 "data_offset": 0, 00:20:56.428 "data_size": 65536 00:20:56.428 }, 00:20:56.428 { 00:20:56.428 "name": "BaseBdev2", 00:20:56.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.428 "is_configured": false, 00:20:56.428 "data_offset": 0, 00:20:56.428 "data_size": 0 00:20:56.428 }, 00:20:56.428 { 00:20:56.428 "name": "BaseBdev3", 00:20:56.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.428 "is_configured": false, 00:20:56.428 "data_offset": 0, 00:20:56.428 "data_size": 0 00:20:56.428 }, 00:20:56.428 { 00:20:56.428 "name": "BaseBdev4", 00:20:56.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:56.428 "is_configured": false, 00:20:56.428 "data_offset": 0, 00:20:56.428 "data_size": 0 00:20:56.428 } 00:20:56.428 ] 00:20:56.428 }' 00:20:56.428 12:02:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:56.428 12:02:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.996 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:57.254 [2024-07-15 12:02:10.701812] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:57.254 [2024-07-15 12:02:10.701858] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e9390 name Existed_Raid, state configuring 00:20:57.254 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:57.512 [2024-07-15 12:02:10.946498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:57.512 [2024-07-15 12:02:10.948015] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:57.512 [2024-07-15 12:02:10.948054] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:57.512 [2024-07-15 12:02:10.948065] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:57.512 [2024-07-15 12:02:10.948077] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:57.512 [2024-07-15 12:02:10.948086] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:57.512 [2024-07-15 12:02:10.948098] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.512 12:02:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:57.770 12:02:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:57.770 "name": "Existed_Raid", 00:20:57.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.770 "strip_size_kb": 64, 00:20:57.770 "state": "configuring", 00:20:57.770 "raid_level": "concat", 00:20:57.770 "superblock": false, 00:20:57.770 "num_base_bdevs": 4, 00:20:57.770 "num_base_bdevs_discovered": 1, 00:20:57.770 "num_base_bdevs_operational": 4, 00:20:57.770 "base_bdevs_list": [ 00:20:57.770 { 00:20:57.770 "name": "BaseBdev1", 00:20:57.770 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:20:57.770 "is_configured": true, 00:20:57.770 "data_offset": 0, 00:20:57.770 "data_size": 65536 00:20:57.770 }, 00:20:57.770 { 00:20:57.770 "name": "BaseBdev2", 00:20:57.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.770 "is_configured": false, 00:20:57.770 "data_offset": 0, 00:20:57.770 "data_size": 0 00:20:57.770 }, 00:20:57.770 { 00:20:57.770 "name": "BaseBdev3", 
00:20:57.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.770 "is_configured": false, 00:20:57.770 "data_offset": 0, 00:20:57.770 "data_size": 0 00:20:57.770 }, 00:20:57.770 { 00:20:57.770 "name": "BaseBdev4", 00:20:57.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:57.770 "is_configured": false, 00:20:57.770 "data_offset": 0, 00:20:57.770 "data_size": 0 00:20:57.770 } 00:20:57.770 ] 00:20:57.770 }' 00:20:57.770 12:02:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.770 12:02:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.337 12:02:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:58.595 [2024-07-15 12:02:12.037076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:58.595 BaseBdev2 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:58.595 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:58.853 [ 00:20:58.853 { 00:20:58.853 "name": "BaseBdev2", 00:20:58.853 "aliases": [ 00:20:58.853 "1bd676db-630e-4496-982d-808ca0a61e75" 00:20:58.853 ], 00:20:58.853 "product_name": "Malloc disk", 00:20:58.853 "block_size": 512, 00:20:58.853 "num_blocks": 65536, 00:20:58.853 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:20:58.853 "assigned_rate_limits": { 00:20:58.853 "rw_ios_per_sec": 0, 00:20:58.853 "rw_mbytes_per_sec": 0, 00:20:58.853 "r_mbytes_per_sec": 0, 00:20:58.853 "w_mbytes_per_sec": 0 00:20:58.853 }, 00:20:58.853 "claimed": true, 00:20:58.853 "claim_type": "exclusive_write", 00:20:58.853 "zoned": false, 00:20:58.853 "supported_io_types": { 00:20:58.853 "read": true, 00:20:58.853 "write": true, 00:20:58.853 "unmap": true, 00:20:58.853 "flush": true, 00:20:58.853 "reset": true, 00:20:58.853 "nvme_admin": false, 00:20:58.853 "nvme_io": false, 00:20:58.853 "nvme_io_md": false, 00:20:58.853 "write_zeroes": true, 00:20:58.853 "zcopy": true, 00:20:58.853 "get_zone_info": false, 00:20:58.853 "zone_management": false, 00:20:58.853 "zone_append": false, 00:20:58.853 "compare": false, 00:20:58.853 "compare_and_write": false, 00:20:58.853 "abort": true, 00:20:58.853 "seek_hole": false, 00:20:58.853 "seek_data": false, 00:20:58.853 "copy": true, 00:20:58.853 "nvme_iov_md": false 00:20:58.853 }, 00:20:58.853 "memory_domains": [ 00:20:58.853 { 00:20:58.853 "dma_device_id": "system", 00:20:58.853 "dma_device_type": 1 00:20:58.853 }, 00:20:58.853 { 00:20:58.853 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.853 "dma_device_type": 2 00:20:58.853 } 00:20:58.853 ], 00:20:58.853 "driver_specific": {} 00:20:58.853 } 00:20:58.853 ] 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:58.853 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.111 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.111 "name": "Existed_Raid", 00:20:59.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.111 "strip_size_kb": 64, 00:20:59.111 "state": "configuring", 00:20:59.111 "raid_level": "concat", 00:20:59.111 "superblock": false, 00:20:59.111 "num_base_bdevs": 4, 00:20:59.111 
"num_base_bdevs_discovered": 2, 00:20:59.111 "num_base_bdevs_operational": 4, 00:20:59.111 "base_bdevs_list": [ 00:20:59.111 { 00:20:59.111 "name": "BaseBdev1", 00:20:59.111 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:20:59.111 "is_configured": true, 00:20:59.111 "data_offset": 0, 00:20:59.111 "data_size": 65536 00:20:59.111 }, 00:20:59.111 { 00:20:59.111 "name": "BaseBdev2", 00:20:59.111 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:20:59.111 "is_configured": true, 00:20:59.111 "data_offset": 0, 00:20:59.111 "data_size": 65536 00:20:59.111 }, 00:20:59.111 { 00:20:59.111 "name": "BaseBdev3", 00:20:59.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.111 "is_configured": false, 00:20:59.111 "data_offset": 0, 00:20:59.111 "data_size": 0 00:20:59.111 }, 00:20:59.111 { 00:20:59.111 "name": "BaseBdev4", 00:20:59.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.111 "is_configured": false, 00:20:59.111 "data_offset": 0, 00:20:59.111 "data_size": 0 00:20:59.111 } 00:20:59.111 ] 00:20:59.111 }' 00:20:59.111 12:02:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.111 12:02:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:59.676 12:02:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:59.934 [2024-07-15 12:02:13.375994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:59.934 BaseBdev3 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:20:59.934 12:02:13 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:20:59.934 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:00.192 12:02:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:00.759 [ 00:21:00.759 { 00:21:00.759 "name": "BaseBdev3", 00:21:00.759 "aliases": [ 00:21:00.759 "6c67a250-b41a-45a8-b8d3-d2469bc025f2" 00:21:00.759 ], 00:21:00.759 "product_name": "Malloc disk", 00:21:00.759 "block_size": 512, 00:21:00.759 "num_blocks": 65536, 00:21:00.759 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:00.759 "assigned_rate_limits": { 00:21:00.759 "rw_ios_per_sec": 0, 00:21:00.759 "rw_mbytes_per_sec": 0, 00:21:00.759 "r_mbytes_per_sec": 0, 00:21:00.759 "w_mbytes_per_sec": 0 00:21:00.759 }, 00:21:00.759 "claimed": true, 00:21:00.759 "claim_type": "exclusive_write", 00:21:00.759 "zoned": false, 00:21:00.759 "supported_io_types": { 00:21:00.759 "read": true, 00:21:00.759 "write": true, 00:21:00.759 "unmap": true, 00:21:00.759 "flush": true, 00:21:00.759 "reset": true, 00:21:00.759 "nvme_admin": false, 00:21:00.759 "nvme_io": false, 00:21:00.759 "nvme_io_md": false, 00:21:00.759 "write_zeroes": true, 00:21:00.759 "zcopy": true, 00:21:00.759 "get_zone_info": false, 00:21:00.759 "zone_management": false, 00:21:00.759 "zone_append": false, 00:21:00.759 "compare": false, 00:21:00.759 "compare_and_write": false, 00:21:00.759 "abort": true, 00:21:00.759 "seek_hole": false, 00:21:00.759 "seek_data": false, 00:21:00.759 "copy": 
true, 00:21:00.759 "nvme_iov_md": false 00:21:00.759 }, 00:21:00.759 "memory_domains": [ 00:21:00.759 { 00:21:00.759 "dma_device_id": "system", 00:21:00.759 "dma_device_type": 1 00:21:00.759 }, 00:21:00.759 { 00:21:00.759 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.759 "dma_device_type": 2 00:21:00.759 } 00:21:00.759 ], 00:21:00.759 "driver_specific": {} 00:21:00.759 } 00:21:00.759 ] 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.759 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.759 "name": "Existed_Raid", 00:21:00.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.759 "strip_size_kb": 64, 00:21:00.759 "state": "configuring", 00:21:00.759 "raid_level": "concat", 00:21:00.759 "superblock": false, 00:21:00.759 "num_base_bdevs": 4, 00:21:00.759 "num_base_bdevs_discovered": 3, 00:21:00.759 "num_base_bdevs_operational": 4, 00:21:00.759 "base_bdevs_list": [ 00:21:00.759 { 00:21:00.759 "name": "BaseBdev1", 00:21:00.759 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:21:00.759 "is_configured": true, 00:21:00.759 "data_offset": 0, 00:21:00.759 "data_size": 65536 00:21:00.759 }, 00:21:00.759 { 00:21:00.759 "name": "BaseBdev2", 00:21:00.759 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:21:00.759 "is_configured": true, 00:21:00.759 "data_offset": 0, 00:21:00.759 "data_size": 65536 00:21:00.759 }, 00:21:00.759 { 00:21:00.759 "name": "BaseBdev3", 00:21:00.760 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:00.760 "is_configured": true, 00:21:00.760 "data_offset": 0, 00:21:00.760 "data_size": 65536 00:21:00.760 }, 00:21:00.760 { 00:21:00.760 "name": "BaseBdev4", 00:21:00.760 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.760 "is_configured": false, 00:21:00.760 "data_offset": 0, 00:21:00.760 "data_size": 0 00:21:00.760 } 00:21:00.760 ] 00:21:00.760 }' 00:21:00.760 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.760 12:02:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:01.696 12:02:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:01.696 [2024-07-15 12:02:15.164230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:01.696 [2024-07-15 12:02:15.164266] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25ea4a0 00:21:01.696 [2024-07-15 12:02:15.164280] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:21:01.696 [2024-07-15 12:02:15.164530] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ea0a0 00:21:01.696 [2024-07-15 12:02:15.164654] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25ea4a0 00:21:01.696 [2024-07-15 12:02:15.164664] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25ea4a0 00:21:01.696 [2024-07-15 12:02:15.164840] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:01.696 BaseBdev4 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:01.696 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:01.956 12:02:15 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:01.956 [ 00:21:01.956 { 00:21:01.956 "name": "BaseBdev4", 00:21:01.956 "aliases": [ 00:21:01.956 "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab" 00:21:01.956 ], 00:21:01.956 "product_name": "Malloc disk", 00:21:01.956 "block_size": 512, 00:21:01.956 "num_blocks": 65536, 00:21:01.956 "uuid": "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab", 00:21:01.956 "assigned_rate_limits": { 00:21:01.956 "rw_ios_per_sec": 0, 00:21:01.956 "rw_mbytes_per_sec": 0, 00:21:01.956 "r_mbytes_per_sec": 0, 00:21:01.956 "w_mbytes_per_sec": 0 00:21:01.956 }, 00:21:01.956 "claimed": true, 00:21:01.956 "claim_type": "exclusive_write", 00:21:01.956 "zoned": false, 00:21:01.956 "supported_io_types": { 00:21:01.956 "read": true, 00:21:01.956 "write": true, 00:21:01.956 "unmap": true, 00:21:01.956 "flush": true, 00:21:01.956 "reset": true, 00:21:01.956 "nvme_admin": false, 00:21:01.956 "nvme_io": false, 00:21:01.956 "nvme_io_md": false, 00:21:01.956 "write_zeroes": true, 00:21:01.956 "zcopy": true, 00:21:01.956 "get_zone_info": false, 00:21:01.956 "zone_management": false, 00:21:01.956 "zone_append": false, 00:21:01.956 "compare": false, 00:21:01.956 "compare_and_write": false, 00:21:01.956 "abort": true, 00:21:01.956 "seek_hole": false, 00:21:01.956 "seek_data": false, 00:21:01.956 "copy": true, 00:21:01.956 "nvme_iov_md": false 00:21:01.956 }, 00:21:01.956 "memory_domains": [ 00:21:01.957 { 00:21:01.957 "dma_device_id": "system", 00:21:01.957 "dma_device_type": 1 00:21:01.957 }, 00:21:01.957 { 00:21:01.957 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:01.957 "dma_device_type": 2 00:21:01.957 } 00:21:01.957 ], 00:21:01.957 "driver_specific": {} 00:21:01.957 } 00:21:01.957 ] 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.957 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.216 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.216 "name": "Existed_Raid", 00:21:02.216 "uuid": "ae164fbc-6d50-49ff-99ff-e16ce0d59f7d", 00:21:02.216 "strip_size_kb": 64, 00:21:02.216 "state": "online", 00:21:02.216 "raid_level": "concat", 00:21:02.216 "superblock": false, 00:21:02.216 
"num_base_bdevs": 4, 00:21:02.216 "num_base_bdevs_discovered": 4, 00:21:02.216 "num_base_bdevs_operational": 4, 00:21:02.216 "base_bdevs_list": [ 00:21:02.216 { 00:21:02.216 "name": "BaseBdev1", 00:21:02.216 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:21:02.216 "is_configured": true, 00:21:02.216 "data_offset": 0, 00:21:02.216 "data_size": 65536 00:21:02.216 }, 00:21:02.216 { 00:21:02.216 "name": "BaseBdev2", 00:21:02.216 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:21:02.216 "is_configured": true, 00:21:02.216 "data_offset": 0, 00:21:02.216 "data_size": 65536 00:21:02.216 }, 00:21:02.216 { 00:21:02.216 "name": "BaseBdev3", 00:21:02.216 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:02.216 "is_configured": true, 00:21:02.216 "data_offset": 0, 00:21:02.216 "data_size": 65536 00:21:02.216 }, 00:21:02.216 { 00:21:02.216 "name": "BaseBdev4", 00:21:02.216 "uuid": "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab", 00:21:02.216 "is_configured": true, 00:21:02.216 "data_offset": 0, 00:21:02.216 "data_size": 65536 00:21:02.216 } 00:21:02.216 ] 00:21:02.216 }' 00:21:02.216 12:02:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.216 12:02:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:02.787 12:02:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:02.787 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:02.787 [2024-07-15 12:02:16.371776] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:03.070 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:03.070 "name": "Existed_Raid", 00:21:03.070 "aliases": [ 00:21:03.070 "ae164fbc-6d50-49ff-99ff-e16ce0d59f7d" 00:21:03.070 ], 00:21:03.070 "product_name": "Raid Volume", 00:21:03.070 "block_size": 512, 00:21:03.070 "num_blocks": 262144, 00:21:03.070 "uuid": "ae164fbc-6d50-49ff-99ff-e16ce0d59f7d", 00:21:03.070 "assigned_rate_limits": { 00:21:03.070 "rw_ios_per_sec": 0, 00:21:03.071 "rw_mbytes_per_sec": 0, 00:21:03.071 "r_mbytes_per_sec": 0, 00:21:03.071 "w_mbytes_per_sec": 0 00:21:03.071 }, 00:21:03.071 "claimed": false, 00:21:03.071 "zoned": false, 00:21:03.071 "supported_io_types": { 00:21:03.071 "read": true, 00:21:03.071 "write": true, 00:21:03.071 "unmap": true, 00:21:03.071 "flush": true, 00:21:03.071 "reset": true, 00:21:03.071 "nvme_admin": false, 00:21:03.071 "nvme_io": false, 00:21:03.071 "nvme_io_md": false, 00:21:03.071 "write_zeroes": true, 00:21:03.071 "zcopy": false, 00:21:03.071 "get_zone_info": false, 00:21:03.071 "zone_management": false, 00:21:03.071 "zone_append": false, 00:21:03.071 "compare": false, 00:21:03.071 "compare_and_write": false, 00:21:03.071 "abort": false, 00:21:03.071 "seek_hole": false, 00:21:03.071 "seek_data": false, 00:21:03.071 "copy": false, 00:21:03.071 "nvme_iov_md": false 00:21:03.071 }, 00:21:03.071 "memory_domains": [ 00:21:03.071 { 00:21:03.071 "dma_device_id": "system", 00:21:03.071 "dma_device_type": 1 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.071 
"dma_device_type": 2 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "system", 00:21:03.071 "dma_device_type": 1 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.071 "dma_device_type": 2 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "system", 00:21:03.071 "dma_device_type": 1 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.071 "dma_device_type": 2 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "system", 00:21:03.071 "dma_device_type": 1 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.071 "dma_device_type": 2 00:21:03.071 } 00:21:03.071 ], 00:21:03.071 "driver_specific": { 00:21:03.071 "raid": { 00:21:03.071 "uuid": "ae164fbc-6d50-49ff-99ff-e16ce0d59f7d", 00:21:03.071 "strip_size_kb": 64, 00:21:03.071 "state": "online", 00:21:03.071 "raid_level": "concat", 00:21:03.071 "superblock": false, 00:21:03.071 "num_base_bdevs": 4, 00:21:03.071 "num_base_bdevs_discovered": 4, 00:21:03.071 "num_base_bdevs_operational": 4, 00:21:03.071 "base_bdevs_list": [ 00:21:03.071 { 00:21:03.071 "name": "BaseBdev1", 00:21:03.071 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:21:03.071 "is_configured": true, 00:21:03.071 "data_offset": 0, 00:21:03.071 "data_size": 65536 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "name": "BaseBdev2", 00:21:03.071 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:21:03.071 "is_configured": true, 00:21:03.071 "data_offset": 0, 00:21:03.071 "data_size": 65536 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "name": "BaseBdev3", 00:21:03.071 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:03.071 "is_configured": true, 00:21:03.071 "data_offset": 0, 00:21:03.071 "data_size": 65536 00:21:03.071 }, 00:21:03.071 { 00:21:03.071 "name": "BaseBdev4", 00:21:03.071 "uuid": "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab", 00:21:03.071 "is_configured": true, 00:21:03.071 "data_offset": 0, 
00:21:03.071 "data_size": 65536 00:21:03.071 } 00:21:03.071 ] 00:21:03.071 } 00:21:03.071 } 00:21:03.071 }' 00:21:03.071 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:03.071 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:03.071 BaseBdev2 00:21:03.071 BaseBdev3 00:21:03.071 BaseBdev4' 00:21:03.071 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:03.071 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:03.071 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.329 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.329 "name": "BaseBdev1", 00:21:03.329 "aliases": [ 00:21:03.329 "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7" 00:21:03.329 ], 00:21:03.329 "product_name": "Malloc disk", 00:21:03.329 "block_size": 512, 00:21:03.329 "num_blocks": 65536, 00:21:03.329 "uuid": "9e3de2bb-a8e4-4b93-b5b6-e59be61c27c7", 00:21:03.329 "assigned_rate_limits": { 00:21:03.330 "rw_ios_per_sec": 0, 00:21:03.330 "rw_mbytes_per_sec": 0, 00:21:03.330 "r_mbytes_per_sec": 0, 00:21:03.330 "w_mbytes_per_sec": 0 00:21:03.330 }, 00:21:03.330 "claimed": true, 00:21:03.330 "claim_type": "exclusive_write", 00:21:03.330 "zoned": false, 00:21:03.330 "supported_io_types": { 00:21:03.330 "read": true, 00:21:03.330 "write": true, 00:21:03.330 "unmap": true, 00:21:03.330 "flush": true, 00:21:03.330 "reset": true, 00:21:03.330 "nvme_admin": false, 00:21:03.330 "nvme_io": false, 00:21:03.330 "nvme_io_md": false, 00:21:03.330 "write_zeroes": true, 00:21:03.330 "zcopy": true, 00:21:03.330 "get_zone_info": false, 00:21:03.330 "zone_management": 
false, 00:21:03.330 "zone_append": false, 00:21:03.330 "compare": false, 00:21:03.330 "compare_and_write": false, 00:21:03.330 "abort": true, 00:21:03.330 "seek_hole": false, 00:21:03.330 "seek_data": false, 00:21:03.330 "copy": true, 00:21:03.330 "nvme_iov_md": false 00:21:03.330 }, 00:21:03.330 "memory_domains": [ 00:21:03.330 { 00:21:03.330 "dma_device_id": "system", 00:21:03.330 "dma_device_type": 1 00:21:03.330 }, 00:21:03.330 { 00:21:03.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.330 "dma_device_type": 2 00:21:03.330 } 00:21:03.330 ], 00:21:03.330 "driver_specific": {} 00:21:03.330 }' 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.330 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:03.589 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:03.589 12:02:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.589 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:03.589 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:03.589 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:03.589 12:02:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:03.589 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:03.848 "name": "BaseBdev2", 00:21:03.848 "aliases": [ 00:21:03.848 "1bd676db-630e-4496-982d-808ca0a61e75" 00:21:03.848 ], 00:21:03.848 "product_name": "Malloc disk", 00:21:03.848 "block_size": 512, 00:21:03.848 "num_blocks": 65536, 00:21:03.848 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:21:03.848 "assigned_rate_limits": { 00:21:03.848 "rw_ios_per_sec": 0, 00:21:03.848 "rw_mbytes_per_sec": 0, 00:21:03.848 "r_mbytes_per_sec": 0, 00:21:03.848 "w_mbytes_per_sec": 0 00:21:03.848 }, 00:21:03.848 "claimed": true, 00:21:03.848 "claim_type": "exclusive_write", 00:21:03.848 "zoned": false, 00:21:03.848 "supported_io_types": { 00:21:03.848 "read": true, 00:21:03.848 "write": true, 00:21:03.848 "unmap": true, 00:21:03.848 "flush": true, 00:21:03.848 "reset": true, 00:21:03.848 "nvme_admin": false, 00:21:03.848 "nvme_io": false, 00:21:03.848 "nvme_io_md": false, 00:21:03.848 "write_zeroes": true, 00:21:03.848 "zcopy": true, 00:21:03.848 "get_zone_info": false, 00:21:03.848 "zone_management": false, 00:21:03.848 "zone_append": false, 00:21:03.848 "compare": false, 00:21:03.848 "compare_and_write": false, 00:21:03.848 "abort": true, 00:21:03.848 "seek_hole": false, 00:21:03.848 "seek_data": false, 00:21:03.848 "copy": true, 00:21:03.848 "nvme_iov_md": false 00:21:03.848 }, 00:21:03.848 "memory_domains": [ 00:21:03.848 { 00:21:03.848 "dma_device_id": "system", 00:21:03.848 "dma_device_type": 1 00:21:03.848 }, 00:21:03.848 { 00:21:03.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.848 "dma_device_type": 2 00:21:03.848 } 00:21:03.848 ], 00:21:03.848 "driver_specific": {} 00:21:03.848 
}' 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:03.848 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:04.107 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:04.367 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:04.367 "name": "BaseBdev3", 00:21:04.367 "aliases": [ 00:21:04.367 "6c67a250-b41a-45a8-b8d3-d2469bc025f2" 00:21:04.367 ], 00:21:04.367 "product_name": "Malloc disk", 00:21:04.367 "block_size": 512, 00:21:04.367 "num_blocks": 65536, 
00:21:04.367 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:04.367 "assigned_rate_limits": { 00:21:04.367 "rw_ios_per_sec": 0, 00:21:04.367 "rw_mbytes_per_sec": 0, 00:21:04.367 "r_mbytes_per_sec": 0, 00:21:04.367 "w_mbytes_per_sec": 0 00:21:04.367 }, 00:21:04.367 "claimed": true, 00:21:04.367 "claim_type": "exclusive_write", 00:21:04.367 "zoned": false, 00:21:04.367 "supported_io_types": { 00:21:04.367 "read": true, 00:21:04.367 "write": true, 00:21:04.367 "unmap": true, 00:21:04.367 "flush": true, 00:21:04.367 "reset": true, 00:21:04.367 "nvme_admin": false, 00:21:04.367 "nvme_io": false, 00:21:04.367 "nvme_io_md": false, 00:21:04.367 "write_zeroes": true, 00:21:04.367 "zcopy": true, 00:21:04.367 "get_zone_info": false, 00:21:04.367 "zone_management": false, 00:21:04.367 "zone_append": false, 00:21:04.367 "compare": false, 00:21:04.367 "compare_and_write": false, 00:21:04.367 "abort": true, 00:21:04.367 "seek_hole": false, 00:21:04.367 "seek_data": false, 00:21:04.367 "copy": true, 00:21:04.367 "nvme_iov_md": false 00:21:04.367 }, 00:21:04.367 "memory_domains": [ 00:21:04.367 { 00:21:04.367 "dma_device_id": "system", 00:21:04.367 "dma_device_type": 1 00:21:04.367 }, 00:21:04.367 { 00:21:04.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.367 "dma_device_type": 2 00:21:04.367 } 00:21:04.367 ], 00:21:04.367 "driver_specific": {} 00:21:04.367 }' 00:21:04.367 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:04.626 12:02:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:04.626 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.884 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:04.884 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:04.884 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:04.884 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:04.884 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:05.143 "name": "BaseBdev4", 00:21:05.143 "aliases": [ 00:21:05.143 "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab" 00:21:05.143 ], 00:21:05.143 "product_name": "Malloc disk", 00:21:05.143 "block_size": 512, 00:21:05.143 "num_blocks": 65536, 00:21:05.143 "uuid": "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab", 00:21:05.143 "assigned_rate_limits": { 00:21:05.143 "rw_ios_per_sec": 0, 00:21:05.143 "rw_mbytes_per_sec": 0, 00:21:05.143 "r_mbytes_per_sec": 0, 00:21:05.143 "w_mbytes_per_sec": 0 00:21:05.143 }, 00:21:05.143 "claimed": true, 00:21:05.143 "claim_type": "exclusive_write", 00:21:05.143 "zoned": false, 00:21:05.143 "supported_io_types": { 00:21:05.143 "read": true, 00:21:05.143 "write": true, 00:21:05.143 "unmap": true, 00:21:05.143 "flush": true, 00:21:05.143 "reset": true, 00:21:05.143 "nvme_admin": false, 00:21:05.143 "nvme_io": false, 00:21:05.143 
"nvme_io_md": false, 00:21:05.143 "write_zeroes": true, 00:21:05.143 "zcopy": true, 00:21:05.143 "get_zone_info": false, 00:21:05.143 "zone_management": false, 00:21:05.143 "zone_append": false, 00:21:05.143 "compare": false, 00:21:05.143 "compare_and_write": false, 00:21:05.143 "abort": true, 00:21:05.143 "seek_hole": false, 00:21:05.143 "seek_data": false, 00:21:05.143 "copy": true, 00:21:05.143 "nvme_iov_md": false 00:21:05.143 }, 00:21:05.143 "memory_domains": [ 00:21:05.143 { 00:21:05.143 "dma_device_id": "system", 00:21:05.143 "dma_device_type": 1 00:21:05.143 }, 00:21:05.143 { 00:21:05.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.143 "dma_device_type": 2 00:21:05.143 } 00:21:05.143 ], 00:21:05.143 "driver_specific": {} 00:21:05.143 }' 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:05.143 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
00:21:05.402 12:02:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:05.662 [2024-07-15 12:02:19.114779] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:05.662 [2024-07-15 12:02:19.114817] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:05.662 [2024-07-15 12:02:19.114885] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.662 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:05.921 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.921 "name": "Existed_Raid", 00:21:05.921 "uuid": "ae164fbc-6d50-49ff-99ff-e16ce0d59f7d", 00:21:05.921 "strip_size_kb": 64, 00:21:05.921 "state": "offline", 00:21:05.921 "raid_level": "concat", 00:21:05.921 "superblock": false, 00:21:05.921 "num_base_bdevs": 4, 00:21:05.921 "num_base_bdevs_discovered": 3, 00:21:05.921 "num_base_bdevs_operational": 3, 00:21:05.921 "base_bdevs_list": [ 00:21:05.921 { 00:21:05.921 "name": null, 00:21:05.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.921 "is_configured": false, 00:21:05.921 "data_offset": 0, 00:21:05.921 "data_size": 65536 00:21:05.921 }, 00:21:05.921 { 00:21:05.921 "name": "BaseBdev2", 00:21:05.921 "uuid": "1bd676db-630e-4496-982d-808ca0a61e75", 00:21:05.921 "is_configured": true, 00:21:05.921 "data_offset": 0, 00:21:05.921 "data_size": 65536 00:21:05.921 }, 00:21:05.921 { 00:21:05.921 "name": "BaseBdev3", 00:21:05.921 "uuid": "6c67a250-b41a-45a8-b8d3-d2469bc025f2", 00:21:05.921 "is_configured": true, 00:21:05.921 "data_offset": 0, 00:21:05.921 "data_size": 65536 00:21:05.921 }, 00:21:05.921 { 00:21:05.921 "name": "BaseBdev4", 00:21:05.921 "uuid": "67e0f484-d9dd-4aaa-aa19-b4d51dbd78ab", 00:21:05.921 "is_configured": true, 00:21:05.921 "data_offset": 0, 00:21:05.921 "data_size": 65536 00:21:05.921 } 00:21:05.921 ] 00:21:05.921 }' 
00:21:05.921 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.921 12:02:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.489 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:06.489 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:06.489 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.489 12:02:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:06.749 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:06.749 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:06.749 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:07.009 [2024-07-15 12:02:20.464247] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:07.009 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:07.009 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:07.009 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.009 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:07.269 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:07.269 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:21:07.269 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:07.528 [2024-07-15 12:02:20.966005] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:07.528 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:07.528 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:07.528 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.528 12:02:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:07.788 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:07.788 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:07.788 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:08.047 [2024-07-15 12:02:21.471729] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:08.047 [2024-07-15 12:02:21.471775] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ea4a0 name Existed_Raid, state offline 00:21:08.047 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:08.047 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:08.047 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.047 12:02:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:08.306 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:08.565 BaseBdev2 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:08.565 12:02:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:08.824 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:09.084 [ 00:21:09.084 { 00:21:09.084 "name": 
"BaseBdev2", 00:21:09.084 "aliases": [ 00:21:09.084 "eefc26a1-85e4-489e-bcf9-42435586a15c" 00:21:09.084 ], 00:21:09.084 "product_name": "Malloc disk", 00:21:09.084 "block_size": 512, 00:21:09.084 "num_blocks": 65536, 00:21:09.084 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:09.084 "assigned_rate_limits": { 00:21:09.084 "rw_ios_per_sec": 0, 00:21:09.084 "rw_mbytes_per_sec": 0, 00:21:09.084 "r_mbytes_per_sec": 0, 00:21:09.084 "w_mbytes_per_sec": 0 00:21:09.084 }, 00:21:09.084 "claimed": false, 00:21:09.084 "zoned": false, 00:21:09.085 "supported_io_types": { 00:21:09.085 "read": true, 00:21:09.085 "write": true, 00:21:09.085 "unmap": true, 00:21:09.085 "flush": true, 00:21:09.085 "reset": true, 00:21:09.085 "nvme_admin": false, 00:21:09.085 "nvme_io": false, 00:21:09.085 "nvme_io_md": false, 00:21:09.085 "write_zeroes": true, 00:21:09.085 "zcopy": true, 00:21:09.085 "get_zone_info": false, 00:21:09.085 "zone_management": false, 00:21:09.085 "zone_append": false, 00:21:09.085 "compare": false, 00:21:09.085 "compare_and_write": false, 00:21:09.085 "abort": true, 00:21:09.085 "seek_hole": false, 00:21:09.085 "seek_data": false, 00:21:09.085 "copy": true, 00:21:09.085 "nvme_iov_md": false 00:21:09.085 }, 00:21:09.085 "memory_domains": [ 00:21:09.085 { 00:21:09.085 "dma_device_id": "system", 00:21:09.085 "dma_device_type": 1 00:21:09.085 }, 00:21:09.085 { 00:21:09.085 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.085 "dma_device_type": 2 00:21:09.085 } 00:21:09.085 ], 00:21:09.085 "driver_specific": {} 00:21:09.085 } 00:21:09.085 ] 00:21:09.085 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:09.085 12:02:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:09.085 12:02:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:09.085 12:02:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:09.345 BaseBdev3 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:09.345 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:09.605 12:02:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:09.605 [ 00:21:09.605 { 00:21:09.605 "name": "BaseBdev3", 00:21:09.605 "aliases": [ 00:21:09.605 "7e888f3b-5265-4523-8429-8a94044d8ea6" 00:21:09.605 ], 00:21:09.605 "product_name": "Malloc disk", 00:21:09.605 "block_size": 512, 00:21:09.605 "num_blocks": 65536, 00:21:09.605 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:09.605 "assigned_rate_limits": { 00:21:09.605 "rw_ios_per_sec": 0, 00:21:09.605 "rw_mbytes_per_sec": 0, 00:21:09.605 "r_mbytes_per_sec": 0, 00:21:09.605 "w_mbytes_per_sec": 0 00:21:09.605 }, 00:21:09.605 "claimed": false, 00:21:09.605 "zoned": false, 00:21:09.605 "supported_io_types": { 00:21:09.605 "read": true, 00:21:09.605 "write": true, 00:21:09.605 "unmap": true, 00:21:09.605 "flush": true, 00:21:09.605 
"reset": true, 00:21:09.605 "nvme_admin": false, 00:21:09.605 "nvme_io": false, 00:21:09.605 "nvme_io_md": false, 00:21:09.605 "write_zeroes": true, 00:21:09.605 "zcopy": true, 00:21:09.605 "get_zone_info": false, 00:21:09.605 "zone_management": false, 00:21:09.606 "zone_append": false, 00:21:09.606 "compare": false, 00:21:09.606 "compare_and_write": false, 00:21:09.606 "abort": true, 00:21:09.606 "seek_hole": false, 00:21:09.606 "seek_data": false, 00:21:09.606 "copy": true, 00:21:09.606 "nvme_iov_md": false 00:21:09.606 }, 00:21:09.606 "memory_domains": [ 00:21:09.606 { 00:21:09.606 "dma_device_id": "system", 00:21:09.606 "dma_device_type": 1 00:21:09.606 }, 00:21:09.606 { 00:21:09.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:09.606 "dma_device_type": 2 00:21:09.606 } 00:21:09.606 ], 00:21:09.606 "driver_specific": {} 00:21:09.606 } 00:21:09.606 ] 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:09.864 BaseBdev4 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:09.864 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:10.123 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:10.123 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:10.123 12:02:23 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:10.123 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:10.123 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:10.383 [ 00:21:10.383 { 00:21:10.383 "name": "BaseBdev4", 00:21:10.383 "aliases": [ 00:21:10.383 "36941fe8-080b-48b9-a1b8-701743dfeafd" 00:21:10.383 ], 00:21:10.383 "product_name": "Malloc disk", 00:21:10.383 "block_size": 512, 00:21:10.383 "num_blocks": 65536, 00:21:10.383 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:10.383 "assigned_rate_limits": { 00:21:10.383 "rw_ios_per_sec": 0, 00:21:10.383 "rw_mbytes_per_sec": 0, 00:21:10.383 "r_mbytes_per_sec": 0, 00:21:10.383 "w_mbytes_per_sec": 0 00:21:10.383 }, 00:21:10.383 "claimed": false, 00:21:10.383 "zoned": false, 00:21:10.383 "supported_io_types": { 00:21:10.383 "read": true, 00:21:10.383 "write": true, 00:21:10.383 "unmap": true, 00:21:10.383 "flush": true, 00:21:10.383 "reset": true, 00:21:10.383 "nvme_admin": false, 00:21:10.383 "nvme_io": false, 00:21:10.383 "nvme_io_md": false, 00:21:10.383 "write_zeroes": true, 00:21:10.383 "zcopy": true, 00:21:10.383 "get_zone_info": false, 00:21:10.383 "zone_management": false, 00:21:10.383 "zone_append": false, 00:21:10.383 "compare": false, 00:21:10.383 "compare_and_write": false, 00:21:10.383 "abort": true, 00:21:10.383 "seek_hole": false, 00:21:10.383 "seek_data": false, 00:21:10.383 "copy": true, 00:21:10.383 "nvme_iov_md": false 00:21:10.383 }, 00:21:10.383 "memory_domains": [ 00:21:10.383 { 00:21:10.383 "dma_device_id": "system", 00:21:10.383 "dma_device_type": 1 00:21:10.383 }, 00:21:10.383 { 00:21:10.383 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:21:10.383 "dma_device_type": 2 00:21:10.383 } 00:21:10.383 ], 00:21:10.383 "driver_specific": {} 00:21:10.383 } 00:21:10.383 ] 00:21:10.383 12:02:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:10.383 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:10.383 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:10.383 12:02:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:10.642 [2024-07-15 12:02:24.144648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:10.642 [2024-07-15 12:02:24.144695] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:10.642 [2024-07-15 12:02:24.144715] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:10.642 [2024-07-15 12:02:24.146032] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:10.642 [2024-07-15 12:02:24.146073] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:10.642 
12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.642 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:10.901 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:10.901 "name": "Existed_Raid", 00:21:10.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.902 "strip_size_kb": 64, 00:21:10.902 "state": "configuring", 00:21:10.902 "raid_level": "concat", 00:21:10.902 "superblock": false, 00:21:10.902 "num_base_bdevs": 4, 00:21:10.902 "num_base_bdevs_discovered": 3, 00:21:10.902 "num_base_bdevs_operational": 4, 00:21:10.902 "base_bdevs_list": [ 00:21:10.902 { 00:21:10.902 "name": "BaseBdev1", 00:21:10.902 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:10.902 "is_configured": false, 00:21:10.902 "data_offset": 0, 00:21:10.902 "data_size": 0 00:21:10.902 }, 00:21:10.902 { 00:21:10.902 "name": "BaseBdev2", 00:21:10.902 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:10.902 "is_configured": true, 00:21:10.902 "data_offset": 0, 00:21:10.902 "data_size": 65536 00:21:10.902 }, 00:21:10.902 { 00:21:10.902 "name": "BaseBdev3", 00:21:10.902 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:10.902 "is_configured": true, 00:21:10.902 "data_offset": 
0, 00:21:10.902 "data_size": 65536 00:21:10.902 }, 00:21:10.902 { 00:21:10.902 "name": "BaseBdev4", 00:21:10.902 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:10.902 "is_configured": true, 00:21:10.902 "data_offset": 0, 00:21:10.902 "data_size": 65536 00:21:10.902 } 00:21:10.902 ] 00:21:10.902 }' 00:21:10.902 12:02:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:10.902 12:02:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.470 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:11.730 [2024-07-15 12:02:25.175425] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:11.730 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:11.989 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:11.989 "name": "Existed_Raid", 00:21:11.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.989 "strip_size_kb": 64, 00:21:11.989 "state": "configuring", 00:21:11.989 "raid_level": "concat", 00:21:11.989 "superblock": false, 00:21:11.989 "num_base_bdevs": 4, 00:21:11.989 "num_base_bdevs_discovered": 2, 00:21:11.989 "num_base_bdevs_operational": 4, 00:21:11.989 "base_bdevs_list": [ 00:21:11.989 { 00:21:11.989 "name": "BaseBdev1", 00:21:11.989 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:11.989 "is_configured": false, 00:21:11.989 "data_offset": 0, 00:21:11.989 "data_size": 0 00:21:11.989 }, 00:21:11.989 { 00:21:11.989 "name": null, 00:21:11.989 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:11.989 "is_configured": false, 00:21:11.989 "data_offset": 0, 00:21:11.989 "data_size": 65536 00:21:11.989 }, 00:21:11.989 { 00:21:11.989 "name": "BaseBdev3", 00:21:11.989 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:11.989 "is_configured": true, 00:21:11.989 "data_offset": 0, 00:21:11.989 "data_size": 65536 00:21:11.989 }, 00:21:11.989 { 00:21:11.989 "name": "BaseBdev4", 00:21:11.989 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:11.989 "is_configured": true, 00:21:11.989 "data_offset": 0, 00:21:11.989 "data_size": 65536 00:21:11.989 } 00:21:11.989 ] 00:21:11.989 }' 00:21:11.989 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:11.989 12:02:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:12.556 12:02:25 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.556 12:02:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:12.814 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:12.814 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:12.814 [2024-07-15 12:02:26.395296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:12.814 BaseBdev1 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:13.071 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:13.329 [ 00:21:13.329 { 00:21:13.329 "name": "BaseBdev1", 00:21:13.329 "aliases": [ 00:21:13.329 
"c496bc2b-4d22-4e2d-9d68-0d798d7370b6" 00:21:13.329 ], 00:21:13.329 "product_name": "Malloc disk", 00:21:13.329 "block_size": 512, 00:21:13.329 "num_blocks": 65536, 00:21:13.329 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:13.329 "assigned_rate_limits": { 00:21:13.329 "rw_ios_per_sec": 0, 00:21:13.329 "rw_mbytes_per_sec": 0, 00:21:13.329 "r_mbytes_per_sec": 0, 00:21:13.329 "w_mbytes_per_sec": 0 00:21:13.329 }, 00:21:13.329 "claimed": true, 00:21:13.329 "claim_type": "exclusive_write", 00:21:13.329 "zoned": false, 00:21:13.329 "supported_io_types": { 00:21:13.329 "read": true, 00:21:13.329 "write": true, 00:21:13.329 "unmap": true, 00:21:13.329 "flush": true, 00:21:13.329 "reset": true, 00:21:13.329 "nvme_admin": false, 00:21:13.329 "nvme_io": false, 00:21:13.329 "nvme_io_md": false, 00:21:13.329 "write_zeroes": true, 00:21:13.329 "zcopy": true, 00:21:13.329 "get_zone_info": false, 00:21:13.329 "zone_management": false, 00:21:13.329 "zone_append": false, 00:21:13.329 "compare": false, 00:21:13.329 "compare_and_write": false, 00:21:13.329 "abort": true, 00:21:13.329 "seek_hole": false, 00:21:13.329 "seek_data": false, 00:21:13.329 "copy": true, 00:21:13.329 "nvme_iov_md": false 00:21:13.329 }, 00:21:13.329 "memory_domains": [ 00:21:13.329 { 00:21:13.329 "dma_device_id": "system", 00:21:13.329 "dma_device_type": 1 00:21:13.329 }, 00:21:13.329 { 00:21:13.329 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:13.329 "dma_device_type": 2 00:21:13.329 } 00:21:13.329 ], 00:21:13.329 "driver_specific": {} 00:21:13.329 } 00:21:13.329 ] 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.329 12:02:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:13.588 12:02:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.588 "name": "Existed_Raid", 00:21:13.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.588 "strip_size_kb": 64, 00:21:13.588 "state": "configuring", 00:21:13.588 "raid_level": "concat", 00:21:13.588 "superblock": false, 00:21:13.588 "num_base_bdevs": 4, 00:21:13.588 "num_base_bdevs_discovered": 3, 00:21:13.588 "num_base_bdevs_operational": 4, 00:21:13.588 "base_bdevs_list": [ 00:21:13.588 { 00:21:13.588 "name": "BaseBdev1", 00:21:13.588 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:13.588 "is_configured": true, 00:21:13.588 "data_offset": 0, 00:21:13.588 "data_size": 65536 00:21:13.588 }, 00:21:13.588 { 00:21:13.588 "name": null, 00:21:13.588 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 
00:21:13.588 "is_configured": false, 00:21:13.588 "data_offset": 0, 00:21:13.588 "data_size": 65536 00:21:13.588 }, 00:21:13.588 { 00:21:13.588 "name": "BaseBdev3", 00:21:13.588 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:13.588 "is_configured": true, 00:21:13.588 "data_offset": 0, 00:21:13.588 "data_size": 65536 00:21:13.588 }, 00:21:13.588 { 00:21:13.588 "name": "BaseBdev4", 00:21:13.588 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:13.588 "is_configured": true, 00:21:13.588 "data_offset": 0, 00:21:13.588 "data_size": 65536 00:21:13.588 } 00:21:13.588 ] 00:21:13.588 }' 00:21:13.588 12:02:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.588 12:02:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:14.524 12:02:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:14.524 12:02:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.524 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:14.524 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:14.783 [2024-07-15 12:02:28.272317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:14.783 12:02:28 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.783 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:15.042 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.042 "name": "Existed_Raid", 00:21:15.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:15.042 "strip_size_kb": 64, 00:21:15.042 "state": "configuring", 00:21:15.042 "raid_level": "concat", 00:21:15.042 "superblock": false, 00:21:15.042 "num_base_bdevs": 4, 00:21:15.042 "num_base_bdevs_discovered": 2, 00:21:15.042 "num_base_bdevs_operational": 4, 00:21:15.042 "base_bdevs_list": [ 00:21:15.042 { 00:21:15.042 "name": "BaseBdev1", 00:21:15.042 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:15.042 "is_configured": true, 00:21:15.042 "data_offset": 0, 00:21:15.042 "data_size": 65536 00:21:15.042 }, 00:21:15.042 { 00:21:15.042 "name": null, 00:21:15.042 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:15.042 "is_configured": false, 00:21:15.042 "data_offset": 0, 00:21:15.042 
"data_size": 65536 00:21:15.042 }, 00:21:15.042 { 00:21:15.042 "name": null, 00:21:15.042 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:15.042 "is_configured": false, 00:21:15.042 "data_offset": 0, 00:21:15.042 "data_size": 65536 00:21:15.042 }, 00:21:15.042 { 00:21:15.042 "name": "BaseBdev4", 00:21:15.042 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:15.042 "is_configured": true, 00:21:15.042 "data_offset": 0, 00:21:15.042 "data_size": 65536 00:21:15.042 } 00:21:15.042 ] 00:21:15.042 }' 00:21:15.042 12:02:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.042 12:02:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:15.609 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.609 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:15.868 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:15.868 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:16.128 [2024-07-15 12:02:29.611895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.128 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.385 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.385 "name": "Existed_Raid", 00:21:16.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:16.385 "strip_size_kb": 64, 00:21:16.385 "state": "configuring", 00:21:16.385 "raid_level": "concat", 00:21:16.385 "superblock": false, 00:21:16.385 "num_base_bdevs": 4, 00:21:16.385 "num_base_bdevs_discovered": 3, 00:21:16.385 "num_base_bdevs_operational": 4, 00:21:16.385 "base_bdevs_list": [ 00:21:16.385 { 00:21:16.385 "name": "BaseBdev1", 00:21:16.385 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:16.385 "is_configured": true, 00:21:16.385 "data_offset": 0, 00:21:16.385 "data_size": 65536 00:21:16.385 }, 00:21:16.385 { 00:21:16.385 "name": null, 00:21:16.386 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:16.386 "is_configured": false, 00:21:16.386 "data_offset": 0, 00:21:16.386 "data_size": 65536 00:21:16.386 }, 00:21:16.386 { 00:21:16.386 "name": 
"BaseBdev3", 00:21:16.386 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:16.386 "is_configured": true, 00:21:16.386 "data_offset": 0, 00:21:16.386 "data_size": 65536 00:21:16.386 }, 00:21:16.386 { 00:21:16.386 "name": "BaseBdev4", 00:21:16.386 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:16.386 "is_configured": true, 00:21:16.386 "data_offset": 0, 00:21:16.386 "data_size": 65536 00:21:16.386 } 00:21:16.386 ] 00:21:16.386 }' 00:21:16.386 12:02:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.386 12:02:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.953 12:02:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:16.953 12:02:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:17.211 12:02:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:17.211 12:02:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:17.470 [2024-07-15 12:02:31.015650] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.470 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.729 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.729 "name": "Existed_Raid", 00:21:17.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:17.729 "strip_size_kb": 64, 00:21:17.729 "state": "configuring", 00:21:17.729 "raid_level": "concat", 00:21:17.729 "superblock": false, 00:21:17.729 "num_base_bdevs": 4, 00:21:17.729 "num_base_bdevs_discovered": 2, 00:21:17.729 "num_base_bdevs_operational": 4, 00:21:17.729 "base_bdevs_list": [ 00:21:17.729 { 00:21:17.729 "name": null, 00:21:17.729 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:17.729 "is_configured": false, 00:21:17.729 "data_offset": 0, 00:21:17.729 "data_size": 65536 00:21:17.729 }, 00:21:17.729 { 00:21:17.729 "name": null, 00:21:17.729 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:17.729 "is_configured": false, 00:21:17.729 "data_offset": 0, 00:21:17.729 "data_size": 65536 00:21:17.729 }, 00:21:17.729 { 00:21:17.729 "name": "BaseBdev3", 00:21:17.729 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:17.729 "is_configured": true, 
00:21:17.729 "data_offset": 0, 00:21:17.729 "data_size": 65536 00:21:17.729 }, 00:21:17.729 { 00:21:17.729 "name": "BaseBdev4", 00:21:17.729 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:17.729 "is_configured": true, 00:21:17.729 "data_offset": 0, 00:21:17.729 "data_size": 65536 00:21:17.729 } 00:21:17.729 ] 00:21:17.729 }' 00:21:17.729 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.729 12:02:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.664 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.664 12:02:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:18.664 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:18.664 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:18.922 [2024-07-15 12:02:32.383492] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.922 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.180 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.180 "name": "Existed_Raid", 00:21:19.180 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:19.180 "strip_size_kb": 64, 00:21:19.180 "state": "configuring", 00:21:19.180 "raid_level": "concat", 00:21:19.180 "superblock": false, 00:21:19.180 "num_base_bdevs": 4, 00:21:19.180 "num_base_bdevs_discovered": 3, 00:21:19.180 "num_base_bdevs_operational": 4, 00:21:19.180 "base_bdevs_list": [ 00:21:19.180 { 00:21:19.180 "name": null, 00:21:19.180 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:19.180 "is_configured": false, 00:21:19.180 "data_offset": 0, 00:21:19.180 "data_size": 65536 00:21:19.180 }, 00:21:19.180 { 00:21:19.180 "name": "BaseBdev2", 00:21:19.180 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:19.180 "is_configured": true, 00:21:19.180 "data_offset": 0, 00:21:19.180 "data_size": 65536 00:21:19.180 }, 00:21:19.180 { 00:21:19.180 "name": "BaseBdev3", 00:21:19.181 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:19.181 "is_configured": true, 00:21:19.181 "data_offset": 0, 00:21:19.181 "data_size": 65536 00:21:19.181 
}, 00:21:19.181 { 00:21:19.181 "name": "BaseBdev4", 00:21:19.181 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:19.181 "is_configured": true, 00:21:19.181 "data_offset": 0, 00:21:19.181 "data_size": 65536 00:21:19.181 } 00:21:19.181 ] 00:21:19.181 }' 00:21:19.181 12:02:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.181 12:02:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:19.746 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.746 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:20.004 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:20.004 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.004 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:20.263 12:02:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u c496bc2b-4d22-4e2d-9d68-0d798d7370b6 00:21:20.522 [2024-07-15 12:02:33.991426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:20.522 [2024-07-15 12:02:33.991467] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25ef240 00:21:20.522 [2024-07-15 12:02:33.991475] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:21:20.522 [2024-07-15 12:02:33.991672] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e8c20 00:21:20.522 
[2024-07-15 12:02:33.991807] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25ef240 00:21:20.522 [2024-07-15 12:02:33.991817] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25ef240 00:21:20.522 [2024-07-15 12:02:33.991980] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.522 NewBaseBdev 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:20.522 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:20.781 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:21.040 [ 00:21:21.040 { 00:21:21.040 "name": "NewBaseBdev", 00:21:21.040 "aliases": [ 00:21:21.040 "c496bc2b-4d22-4e2d-9d68-0d798d7370b6" 00:21:21.040 ], 00:21:21.040 "product_name": "Malloc disk", 00:21:21.040 "block_size": 512, 00:21:21.040 "num_blocks": 65536, 00:21:21.040 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:21.040 "assigned_rate_limits": { 00:21:21.040 "rw_ios_per_sec": 0, 00:21:21.040 "rw_mbytes_per_sec": 0, 00:21:21.040 "r_mbytes_per_sec": 0, 00:21:21.040 
"w_mbytes_per_sec": 0 00:21:21.040 }, 00:21:21.040 "claimed": true, 00:21:21.040 "claim_type": "exclusive_write", 00:21:21.040 "zoned": false, 00:21:21.040 "supported_io_types": { 00:21:21.040 "read": true, 00:21:21.040 "write": true, 00:21:21.040 "unmap": true, 00:21:21.040 "flush": true, 00:21:21.040 "reset": true, 00:21:21.040 "nvme_admin": false, 00:21:21.040 "nvme_io": false, 00:21:21.040 "nvme_io_md": false, 00:21:21.040 "write_zeroes": true, 00:21:21.040 "zcopy": true, 00:21:21.040 "get_zone_info": false, 00:21:21.040 "zone_management": false, 00:21:21.040 "zone_append": false, 00:21:21.040 "compare": false, 00:21:21.040 "compare_and_write": false, 00:21:21.040 "abort": true, 00:21:21.040 "seek_hole": false, 00:21:21.040 "seek_data": false, 00:21:21.040 "copy": true, 00:21:21.040 "nvme_iov_md": false 00:21:21.040 }, 00:21:21.040 "memory_domains": [ 00:21:21.040 { 00:21:21.040 "dma_device_id": "system", 00:21:21.040 "dma_device_type": 1 00:21:21.040 }, 00:21:21.040 { 00:21:21.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.040 "dma_device_type": 2 00:21:21.040 } 00:21:21.040 ], 00:21:21.040 "driver_specific": {} 00:21:21.040 } 00:21:21.040 ] 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:21.040 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:21.299 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:21.299 "name": "Existed_Raid", 00:21:21.299 "uuid": "02e5f1cd-b9e2-4144-8a85-f2475ba367f5", 00:21:21.299 "strip_size_kb": 64, 00:21:21.299 "state": "online", 00:21:21.299 "raid_level": "concat", 00:21:21.299 "superblock": false, 00:21:21.299 "num_base_bdevs": 4, 00:21:21.299 "num_base_bdevs_discovered": 4, 00:21:21.299 "num_base_bdevs_operational": 4, 00:21:21.299 "base_bdevs_list": [ 00:21:21.299 { 00:21:21.299 "name": "NewBaseBdev", 00:21:21.299 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:21.299 "is_configured": true, 00:21:21.299 "data_offset": 0, 00:21:21.299 "data_size": 65536 00:21:21.299 }, 00:21:21.299 { 00:21:21.299 "name": "BaseBdev2", 00:21:21.299 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:21.299 "is_configured": true, 00:21:21.299 "data_offset": 0, 00:21:21.299 "data_size": 65536 00:21:21.299 }, 00:21:21.299 { 00:21:21.299 "name": "BaseBdev3", 00:21:21.299 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:21.299 "is_configured": true, 00:21:21.299 "data_offset": 0, 00:21:21.299 "data_size": 65536 00:21:21.299 }, 00:21:21.299 { 00:21:21.299 "name": "BaseBdev4", 
00:21:21.299 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:21.299 "is_configured": true, 00:21:21.299 "data_offset": 0, 00:21:21.299 "data_size": 65536 00:21:21.299 } 00:21:21.299 ] 00:21:21.299 }' 00:21:21.299 12:02:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:21.299 12:02:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:21.881 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:22.141 [2024-07-15 12:02:35.656189] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:22.141 "name": "Existed_Raid", 00:21:22.141 "aliases": [ 00:21:22.141 "02e5f1cd-b9e2-4144-8a85-f2475ba367f5" 00:21:22.141 ], 00:21:22.141 "product_name": "Raid Volume", 00:21:22.141 "block_size": 512, 00:21:22.141 "num_blocks": 262144, 00:21:22.141 "uuid": "02e5f1cd-b9e2-4144-8a85-f2475ba367f5", 00:21:22.141 "assigned_rate_limits": { 00:21:22.141 "rw_ios_per_sec": 0, 00:21:22.141 
"rw_mbytes_per_sec": 0, 00:21:22.141 "r_mbytes_per_sec": 0, 00:21:22.141 "w_mbytes_per_sec": 0 00:21:22.141 }, 00:21:22.141 "claimed": false, 00:21:22.141 "zoned": false, 00:21:22.141 "supported_io_types": { 00:21:22.141 "read": true, 00:21:22.141 "write": true, 00:21:22.141 "unmap": true, 00:21:22.141 "flush": true, 00:21:22.141 "reset": true, 00:21:22.141 "nvme_admin": false, 00:21:22.141 "nvme_io": false, 00:21:22.141 "nvme_io_md": false, 00:21:22.141 "write_zeroes": true, 00:21:22.141 "zcopy": false, 00:21:22.141 "get_zone_info": false, 00:21:22.141 "zone_management": false, 00:21:22.141 "zone_append": false, 00:21:22.141 "compare": false, 00:21:22.141 "compare_and_write": false, 00:21:22.141 "abort": false, 00:21:22.141 "seek_hole": false, 00:21:22.141 "seek_data": false, 00:21:22.141 "copy": false, 00:21:22.141 "nvme_iov_md": false 00:21:22.141 }, 00:21:22.141 "memory_domains": [ 00:21:22.141 { 00:21:22.141 "dma_device_id": "system", 00:21:22.141 "dma_device_type": 1 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.141 "dma_device_type": 2 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "system", 00:21:22.141 "dma_device_type": 1 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.141 "dma_device_type": 2 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "system", 00:21:22.141 "dma_device_type": 1 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.141 "dma_device_type": 2 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "system", 00:21:22.141 "dma_device_type": 1 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.141 "dma_device_type": 2 00:21:22.141 } 00:21:22.141 ], 00:21:22.141 "driver_specific": { 00:21:22.141 "raid": { 00:21:22.141 "uuid": "02e5f1cd-b9e2-4144-8a85-f2475ba367f5", 00:21:22.141 "strip_size_kb": 64, 00:21:22.141 "state": "online", 
00:21:22.141 "raid_level": "concat", 00:21:22.141 "superblock": false, 00:21:22.141 "num_base_bdevs": 4, 00:21:22.141 "num_base_bdevs_discovered": 4, 00:21:22.141 "num_base_bdevs_operational": 4, 00:21:22.141 "base_bdevs_list": [ 00:21:22.141 { 00:21:22.141 "name": "NewBaseBdev", 00:21:22.141 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:22.141 "is_configured": true, 00:21:22.141 "data_offset": 0, 00:21:22.141 "data_size": 65536 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "name": "BaseBdev2", 00:21:22.141 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:22.141 "is_configured": true, 00:21:22.141 "data_offset": 0, 00:21:22.141 "data_size": 65536 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "name": "BaseBdev3", 00:21:22.141 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:22.141 "is_configured": true, 00:21:22.141 "data_offset": 0, 00:21:22.141 "data_size": 65536 00:21:22.141 }, 00:21:22.141 { 00:21:22.141 "name": "BaseBdev4", 00:21:22.141 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:22.141 "is_configured": true, 00:21:22.141 "data_offset": 0, 00:21:22.141 "data_size": 65536 00:21:22.141 } 00:21:22.141 ] 00:21:22.141 } 00:21:22.141 } 00:21:22.141 }' 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:22.141 BaseBdev2 00:21:22.141 BaseBdev3 00:21:22.141 BaseBdev4' 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:22.141 12:02:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.710 12:02:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.710 "name": "NewBaseBdev", 00:21:22.710 "aliases": [ 00:21:22.710 "c496bc2b-4d22-4e2d-9d68-0d798d7370b6" 00:21:22.710 ], 00:21:22.710 "product_name": "Malloc disk", 00:21:22.710 "block_size": 512, 00:21:22.710 "num_blocks": 65536, 00:21:22.710 "uuid": "c496bc2b-4d22-4e2d-9d68-0d798d7370b6", 00:21:22.710 "assigned_rate_limits": { 00:21:22.710 "rw_ios_per_sec": 0, 00:21:22.710 "rw_mbytes_per_sec": 0, 00:21:22.711 "r_mbytes_per_sec": 0, 00:21:22.711 "w_mbytes_per_sec": 0 00:21:22.711 }, 00:21:22.711 "claimed": true, 00:21:22.711 "claim_type": "exclusive_write", 00:21:22.711 "zoned": false, 00:21:22.711 "supported_io_types": { 00:21:22.711 "read": true, 00:21:22.711 "write": true, 00:21:22.711 "unmap": true, 00:21:22.711 "flush": true, 00:21:22.711 "reset": true, 00:21:22.711 "nvme_admin": false, 00:21:22.711 "nvme_io": false, 00:21:22.711 "nvme_io_md": false, 00:21:22.711 "write_zeroes": true, 00:21:22.711 "zcopy": true, 00:21:22.711 "get_zone_info": false, 00:21:22.711 "zone_management": false, 00:21:22.711 "zone_append": false, 00:21:22.711 "compare": false, 00:21:22.711 "compare_and_write": false, 00:21:22.711 "abort": true, 00:21:22.711 "seek_hole": false, 00:21:22.711 "seek_data": false, 00:21:22.711 "copy": true, 00:21:22.711 "nvme_iov_md": false 00:21:22.711 }, 00:21:22.711 "memory_domains": [ 00:21:22.711 { 00:21:22.711 "dma_device_id": "system", 00:21:22.711 "dma_device_type": 1 00:21:22.711 }, 00:21:22.711 { 00:21:22.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.711 "dma_device_type": 2 00:21:22.711 } 00:21:22.711 ], 00:21:22.711 "driver_specific": {} 00:21:22.711 }' 00:21:22.711 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.711 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.970 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.230 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.230 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.230 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.230 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:23.230 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:23.489 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:23.489 "name": "BaseBdev2", 00:21:23.489 "aliases": [ 00:21:23.489 "eefc26a1-85e4-489e-bcf9-42435586a15c" 00:21:23.489 ], 00:21:23.489 "product_name": "Malloc disk", 00:21:23.489 "block_size": 512, 00:21:23.489 "num_blocks": 65536, 00:21:23.489 "uuid": "eefc26a1-85e4-489e-bcf9-42435586a15c", 00:21:23.489 "assigned_rate_limits": { 00:21:23.489 "rw_ios_per_sec": 0, 00:21:23.489 "rw_mbytes_per_sec": 0, 00:21:23.489 "r_mbytes_per_sec": 0, 00:21:23.489 "w_mbytes_per_sec": 0 00:21:23.489 }, 00:21:23.489 "claimed": true, 00:21:23.489 
"claim_type": "exclusive_write", 00:21:23.489 "zoned": false, 00:21:23.489 "supported_io_types": { 00:21:23.489 "read": true, 00:21:23.489 "write": true, 00:21:23.489 "unmap": true, 00:21:23.489 "flush": true, 00:21:23.489 "reset": true, 00:21:23.489 "nvme_admin": false, 00:21:23.489 "nvme_io": false, 00:21:23.489 "nvme_io_md": false, 00:21:23.489 "write_zeroes": true, 00:21:23.489 "zcopy": true, 00:21:23.489 "get_zone_info": false, 00:21:23.489 "zone_management": false, 00:21:23.489 "zone_append": false, 00:21:23.489 "compare": false, 00:21:23.489 "compare_and_write": false, 00:21:23.489 "abort": true, 00:21:23.489 "seek_hole": false, 00:21:23.489 "seek_data": false, 00:21:23.489 "copy": true, 00:21:23.489 "nvme_iov_md": false 00:21:23.489 }, 00:21:23.489 "memory_domains": [ 00:21:23.489 { 00:21:23.489 "dma_device_id": "system", 00:21:23.489 "dma_device_type": 1 00:21:23.489 }, 00:21:23.489 { 00:21:23.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.489 "dma_device_type": 2 00:21:23.489 } 00:21:23.489 ], 00:21:23.489 "driver_specific": {} 00:21:23.489 }' 00:21:23.489 12:02:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.489 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.489 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:23.489 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:23.749 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.008 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.008 "name": "BaseBdev3", 00:21:24.008 "aliases": [ 00:21:24.008 "7e888f3b-5265-4523-8429-8a94044d8ea6" 00:21:24.008 ], 00:21:24.008 "product_name": "Malloc disk", 00:21:24.008 "block_size": 512, 00:21:24.008 "num_blocks": 65536, 00:21:24.008 "uuid": "7e888f3b-5265-4523-8429-8a94044d8ea6", 00:21:24.008 "assigned_rate_limits": { 00:21:24.008 "rw_ios_per_sec": 0, 00:21:24.008 "rw_mbytes_per_sec": 0, 00:21:24.008 "r_mbytes_per_sec": 0, 00:21:24.008 "w_mbytes_per_sec": 0 00:21:24.008 }, 00:21:24.008 "claimed": true, 00:21:24.008 "claim_type": "exclusive_write", 00:21:24.008 "zoned": false, 00:21:24.008 "supported_io_types": { 00:21:24.008 "read": true, 00:21:24.008 "write": true, 00:21:24.008 "unmap": true, 00:21:24.008 "flush": true, 00:21:24.008 "reset": true, 00:21:24.008 "nvme_admin": false, 00:21:24.008 "nvme_io": false, 00:21:24.008 "nvme_io_md": false, 00:21:24.008 "write_zeroes": true, 00:21:24.008 "zcopy": true, 00:21:24.008 "get_zone_info": false, 00:21:24.008 "zone_management": false, 00:21:24.008 "zone_append": false, 00:21:24.008 "compare": false, 00:21:24.008 "compare_and_write": false, 00:21:24.008 "abort": true, 00:21:24.008 
"seek_hole": false, 00:21:24.008 "seek_data": false, 00:21:24.008 "copy": true, 00:21:24.008 "nvme_iov_md": false 00:21:24.008 }, 00:21:24.008 "memory_domains": [ 00:21:24.008 { 00:21:24.008 "dma_device_id": "system", 00:21:24.008 "dma_device_type": 1 00:21:24.008 }, 00:21:24.008 { 00:21:24.008 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.008 "dma_device_type": 2 00:21:24.008 } 00:21:24.008 ], 00:21:24.008 "driver_specific": {} 00:21:24.008 }' 00:21:24.008 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.008 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:24.267 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.526 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.526 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:24.526 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:24.526 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:21:24.526 12:02:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.786 "name": "BaseBdev4", 00:21:24.786 "aliases": [ 00:21:24.786 "36941fe8-080b-48b9-a1b8-701743dfeafd" 00:21:24.786 ], 00:21:24.786 "product_name": "Malloc disk", 00:21:24.786 "block_size": 512, 00:21:24.786 "num_blocks": 65536, 00:21:24.786 "uuid": "36941fe8-080b-48b9-a1b8-701743dfeafd", 00:21:24.786 "assigned_rate_limits": { 00:21:24.786 "rw_ios_per_sec": 0, 00:21:24.786 "rw_mbytes_per_sec": 0, 00:21:24.786 "r_mbytes_per_sec": 0, 00:21:24.786 "w_mbytes_per_sec": 0 00:21:24.786 }, 00:21:24.786 "claimed": true, 00:21:24.786 "claim_type": "exclusive_write", 00:21:24.786 "zoned": false, 00:21:24.786 "supported_io_types": { 00:21:24.786 "read": true, 00:21:24.786 "write": true, 00:21:24.786 "unmap": true, 00:21:24.786 "flush": true, 00:21:24.786 "reset": true, 00:21:24.786 "nvme_admin": false, 00:21:24.786 "nvme_io": false, 00:21:24.786 "nvme_io_md": false, 00:21:24.786 "write_zeroes": true, 00:21:24.786 "zcopy": true, 00:21:24.786 "get_zone_info": false, 00:21:24.786 "zone_management": false, 00:21:24.786 "zone_append": false, 00:21:24.786 "compare": false, 00:21:24.786 "compare_and_write": false, 00:21:24.786 "abort": true, 00:21:24.786 "seek_hole": false, 00:21:24.786 "seek_data": false, 00:21:24.786 "copy": true, 00:21:24.786 "nvme_iov_md": false 00:21:24.786 }, 00:21:24.786 "memory_domains": [ 00:21:24.786 { 00:21:24.786 "dma_device_id": "system", 00:21:24.786 "dma_device_type": 1 00:21:24.786 }, 00:21:24.786 { 00:21:24.786 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.786 "dma_device_type": 2 00:21:24.786 } 00:21:24.786 ], 00:21:24.786 "driver_specific": {} 00:21:24.786 }' 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.786 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.046 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:25.305 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:25.305 12:02:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:25.305 [2024-07-15 12:02:38.876404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:25.305 [2024-07-15 12:02:38.876429] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:25.305 [2024-07-15 12:02:38.876481] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:25.305 [2024-07-15 12:02:38.876541] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:25.305 [2024-07-15 12:02:38.876553] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25ef240 name Existed_Raid, state offline 00:21:25.305 12:02:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1532101 00:21:25.305 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1532101 ']' 00:21:25.305 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1532101 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1532101 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1532101' 00:21:25.565 killing process with pid 1532101 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1532101 00:21:25.565 [2024-07-15 12:02:38.948545] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:25.565 12:02:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1532101 00:21:25.565 [2024-07-15 12:02:38.986972] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:21:25.824 00:21:25.824 real 0m32.707s 00:21:25.824 user 0m59.999s 00:21:25.824 sys 0m5.885s 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.824 ************************************ 00:21:25.824 END TEST 
raid_state_function_test 00:21:25.824 ************************************ 00:21:25.824 12:02:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:21:25.824 12:02:39 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:21:25.824 12:02:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:21:25.824 12:02:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:21:25.824 12:02:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:25.824 ************************************ 00:21:25.824 START TEST raid_state_function_test_sb 00:21:25.824 ************************************ 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test concat 4 true 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:21:25.824 12:02:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1536978 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1536978' 00:21:25.824 Process raid pid: 1536978 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1536978 /var/tmp/spdk-raid.sock 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1536978 ']' 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:25.824 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:21:25.824 12:02:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:25.824 [2024-07-15 12:02:39.364986] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:21:25.824 [2024-07-15 12:02:39.365039] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:21:26.083 [2024-07-15 12:02:39.477404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:26.083 [2024-07-15 12:02:39.579932] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:26.083 [2024-07-15 12:02:39.636970] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:26.083 [2024-07-15 12:02:39.636996] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:27.066 [2024-07-15 12:02:40.537873] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:27.066 [2024-07-15 12:02:40.537920] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:27.066 [2024-07-15 12:02:40.537931] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:27.066 [2024-07-15 12:02:40.537943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:27.066 [2024-07-15 12:02:40.537952] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:27.066 [2024-07-15 12:02:40.537963] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:21:27.066 [2024-07-15 12:02:40.537975] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:27.066 [2024-07-15 12:02:40.537986] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:27.066 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:27.327 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:27.327 "name": "Existed_Raid", 00:21:27.327 "uuid": 
"baeb2329-bb10-450f-9161-459a948d6032", 00:21:27.327 "strip_size_kb": 64, 00:21:27.327 "state": "configuring", 00:21:27.327 "raid_level": "concat", 00:21:27.327 "superblock": true, 00:21:27.327 "num_base_bdevs": 4, 00:21:27.327 "num_base_bdevs_discovered": 0, 00:21:27.327 "num_base_bdevs_operational": 4, 00:21:27.327 "base_bdevs_list": [ 00:21:27.327 { 00:21:27.327 "name": "BaseBdev1", 00:21:27.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.327 "is_configured": false, 00:21:27.327 "data_offset": 0, 00:21:27.327 "data_size": 0 00:21:27.327 }, 00:21:27.327 { 00:21:27.327 "name": "BaseBdev2", 00:21:27.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.327 "is_configured": false, 00:21:27.327 "data_offset": 0, 00:21:27.327 "data_size": 0 00:21:27.327 }, 00:21:27.327 { 00:21:27.327 "name": "BaseBdev3", 00:21:27.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.327 "is_configured": false, 00:21:27.327 "data_offset": 0, 00:21:27.327 "data_size": 0 00:21:27.327 }, 00:21:27.327 { 00:21:27.327 "name": "BaseBdev4", 00:21:27.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:27.327 "is_configured": false, 00:21:27.327 "data_offset": 0, 00:21:27.327 "data_size": 0 00:21:27.327 } 00:21:27.327 ] 00:21:27.327 }' 00:21:27.327 12:02:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:27.327 12:02:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:27.895 12:02:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:28.464 [2024-07-15 12:02:41.917467] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:28.464 [2024-07-15 12:02:41.917502] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2489b20 name Existed_Raid, state configuring 00:21:28.464 12:02:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:28.724 [2024-07-15 12:02:42.222297] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:28.724 [2024-07-15 12:02:42.222331] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:28.724 [2024-07-15 12:02:42.222341] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:28.724 [2024-07-15 12:02:42.222353] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:28.724 [2024-07-15 12:02:42.222361] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:28.724 [2024-07-15 12:02:42.222373] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:28.724 [2024-07-15 12:02:42.222381] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:28.724 [2024-07-15 12:02:42.222392] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:28.724 12:02:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:29.292 [2024-07-15 12:02:42.738809] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:29.292 BaseBdev1 00:21:29.292 12:02:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_timeout= 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:29.293 12:02:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:29.552 12:02:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:29.820 [ 00:21:29.820 { 00:21:29.820 "name": "BaseBdev1", 00:21:29.820 "aliases": [ 00:21:29.820 "722fa310-2619-4fa5-bb12-e221c8ef9dd2" 00:21:29.820 ], 00:21:29.820 "product_name": "Malloc disk", 00:21:29.820 "block_size": 512, 00:21:29.820 "num_blocks": 65536, 00:21:29.820 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:29.820 "assigned_rate_limits": { 00:21:29.820 "rw_ios_per_sec": 0, 00:21:29.820 "rw_mbytes_per_sec": 0, 00:21:29.820 "r_mbytes_per_sec": 0, 00:21:29.820 "w_mbytes_per_sec": 0 00:21:29.820 }, 00:21:29.820 "claimed": true, 00:21:29.820 "claim_type": "exclusive_write", 00:21:29.820 "zoned": false, 00:21:29.820 "supported_io_types": { 00:21:29.820 "read": true, 00:21:29.820 "write": true, 00:21:29.820 "unmap": true, 00:21:29.820 "flush": true, 00:21:29.820 "reset": true, 00:21:29.820 "nvme_admin": false, 00:21:29.820 "nvme_io": false, 00:21:29.820 "nvme_io_md": false, 00:21:29.820 "write_zeroes": true, 00:21:29.820 "zcopy": true, 00:21:29.820 "get_zone_info": false, 00:21:29.820 "zone_management": false, 00:21:29.820 "zone_append": false, 00:21:29.820 "compare": false, 00:21:29.820 "compare_and_write": false, 00:21:29.820 "abort": true, 00:21:29.820 "seek_hole": 
false, 00:21:29.820 "seek_data": false, 00:21:29.820 "copy": true, 00:21:29.820 "nvme_iov_md": false 00:21:29.820 }, 00:21:29.820 "memory_domains": [ 00:21:29.820 { 00:21:29.820 "dma_device_id": "system", 00:21:29.820 "dma_device_type": 1 00:21:29.820 }, 00:21:29.820 { 00:21:29.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:29.820 "dma_device_type": 2 00:21:29.820 } 00:21:29.820 ], 00:21:29.820 "driver_specific": {} 00:21:29.820 } 00:21:29.820 ] 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:29.820 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:29.820 12:02:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:30.392 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.392 "name": "Existed_Raid", 00:21:30.392 "uuid": "367faf05-acfa-4077-8edb-b89b7c6c9789", 00:21:30.392 "strip_size_kb": 64, 00:21:30.392 "state": "configuring", 00:21:30.392 "raid_level": "concat", 00:21:30.392 "superblock": true, 00:21:30.392 "num_base_bdevs": 4, 00:21:30.392 "num_base_bdevs_discovered": 1, 00:21:30.392 "num_base_bdevs_operational": 4, 00:21:30.392 "base_bdevs_list": [ 00:21:30.392 { 00:21:30.392 "name": "BaseBdev1", 00:21:30.392 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:30.392 "is_configured": true, 00:21:30.392 "data_offset": 2048, 00:21:30.392 "data_size": 63488 00:21:30.392 }, 00:21:30.392 { 00:21:30.392 "name": "BaseBdev2", 00:21:30.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.392 "is_configured": false, 00:21:30.392 "data_offset": 0, 00:21:30.392 "data_size": 0 00:21:30.392 }, 00:21:30.392 { 00:21:30.392 "name": "BaseBdev3", 00:21:30.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.392 "is_configured": false, 00:21:30.392 "data_offset": 0, 00:21:30.392 "data_size": 0 00:21:30.392 }, 00:21:30.392 { 00:21:30.392 "name": "BaseBdev4", 00:21:30.392 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:30.392 "is_configured": false, 00:21:30.392 "data_offset": 0, 00:21:30.392 "data_size": 0 00:21:30.392 } 00:21:30.392 ] 00:21:30.392 }' 00:21:30.392 12:02:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.392 12:02:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:30.959 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:31.217 [2024-07-15 
12:02:44.688008] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:31.218 [2024-07-15 12:02:44.688052] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2489390 name Existed_Raid, state configuring 00:21:31.218 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:31.476 [2024-07-15 12:02:44.936720] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:31.476 [2024-07-15 12:02:44.938158] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:31.476 [2024-07-15 12:02:44.938193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:31.476 [2024-07-15 12:02:44.938204] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:31.476 [2024-07-15 12:02:44.938215] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:31.476 [2024-07-15 12:02:44.938225] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:31.476 [2024-07-15 12:02:44.938236] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:31.476 12:02:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:31.735 12:02:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:31.735 "name": "Existed_Raid", 00:21:31.735 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:31.735 "strip_size_kb": 64, 00:21:31.735 "state": "configuring", 00:21:31.735 "raid_level": "concat", 00:21:31.735 "superblock": true, 00:21:31.735 "num_base_bdevs": 4, 00:21:31.735 "num_base_bdevs_discovered": 1, 00:21:31.735 "num_base_bdevs_operational": 4, 00:21:31.735 "base_bdevs_list": [ 00:21:31.735 { 00:21:31.735 "name": "BaseBdev1", 00:21:31.735 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:31.735 "is_configured": true, 00:21:31.735 "data_offset": 2048, 00:21:31.735 "data_size": 63488 00:21:31.735 }, 00:21:31.735 { 00:21:31.735 "name": "BaseBdev2", 00:21:31.735 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:31.735 "is_configured": false, 00:21:31.735 "data_offset": 0, 00:21:31.735 "data_size": 0 00:21:31.735 }, 00:21:31.735 { 00:21:31.735 "name": "BaseBdev3", 00:21:31.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.735 "is_configured": false, 00:21:31.735 "data_offset": 0, 00:21:31.735 "data_size": 0 00:21:31.735 }, 00:21:31.735 { 00:21:31.735 "name": "BaseBdev4", 00:21:31.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:31.735 "is_configured": false, 00:21:31.735 "data_offset": 0, 00:21:31.735 "data_size": 0 00:21:31.735 } 00:21:31.735 ] 00:21:31.735 }' 00:21:31.736 12:02:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:31.736 12:02:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:32.673 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:32.932 [2024-07-15 12:02:46.327824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:32.932 BaseBdev2 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:32.932 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:33.191 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:33.450 [ 00:21:33.450 { 00:21:33.450 "name": "BaseBdev2", 00:21:33.450 "aliases": [ 00:21:33.450 "03f89940-af19-475d-8921-30860dd28781" 00:21:33.450 ], 00:21:33.450 "product_name": "Malloc disk", 00:21:33.450 "block_size": 512, 00:21:33.450 "num_blocks": 65536, 00:21:33.450 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:33.450 "assigned_rate_limits": { 00:21:33.450 "rw_ios_per_sec": 0, 00:21:33.450 "rw_mbytes_per_sec": 0, 00:21:33.450 "r_mbytes_per_sec": 0, 00:21:33.450 "w_mbytes_per_sec": 0 00:21:33.450 }, 00:21:33.450 "claimed": true, 00:21:33.450 "claim_type": "exclusive_write", 00:21:33.450 "zoned": false, 00:21:33.450 "supported_io_types": { 00:21:33.450 "read": true, 00:21:33.450 "write": true, 00:21:33.450 "unmap": true, 00:21:33.450 "flush": true, 00:21:33.450 "reset": true, 00:21:33.450 "nvme_admin": false, 00:21:33.450 "nvme_io": false, 00:21:33.450 "nvme_io_md": false, 00:21:33.450 "write_zeroes": true, 00:21:33.450 "zcopy": true, 00:21:33.450 "get_zone_info": false, 00:21:33.450 "zone_management": false, 00:21:33.450 "zone_append": false, 00:21:33.450 "compare": false, 00:21:33.450 "compare_and_write": false, 00:21:33.450 "abort": true, 00:21:33.450 "seek_hole": false, 00:21:33.450 "seek_data": false, 00:21:33.450 "copy": true, 00:21:33.450 "nvme_iov_md": false 00:21:33.450 }, 00:21:33.450 "memory_domains": [ 00:21:33.450 { 00:21:33.450 "dma_device_id": "system", 00:21:33.450 "dma_device_type": 1 00:21:33.450 }, 00:21:33.450 { 00:21:33.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.450 "dma_device_type": 2 00:21:33.450 } 00:21:33.450 ], 00:21:33.450 "driver_specific": {} 00:21:33.450 } 00:21:33.450 ] 
00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:33.450 12:02:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:33.709 12:02:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:33.709 "name": "Existed_Raid", 
00:21:33.709 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:33.709 "strip_size_kb": 64, 00:21:33.709 "state": "configuring", 00:21:33.709 "raid_level": "concat", 00:21:33.709 "superblock": true, 00:21:33.709 "num_base_bdevs": 4, 00:21:33.709 "num_base_bdevs_discovered": 2, 00:21:33.709 "num_base_bdevs_operational": 4, 00:21:33.709 "base_bdevs_list": [ 00:21:33.709 { 00:21:33.709 "name": "BaseBdev1", 00:21:33.709 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:33.709 "is_configured": true, 00:21:33.709 "data_offset": 2048, 00:21:33.709 "data_size": 63488 00:21:33.709 }, 00:21:33.709 { 00:21:33.709 "name": "BaseBdev2", 00:21:33.709 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:33.709 "is_configured": true, 00:21:33.709 "data_offset": 2048, 00:21:33.709 "data_size": 63488 00:21:33.709 }, 00:21:33.709 { 00:21:33.709 "name": "BaseBdev3", 00:21:33.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.709 "is_configured": false, 00:21:33.709 "data_offset": 0, 00:21:33.709 "data_size": 0 00:21:33.709 }, 00:21:33.709 { 00:21:33.709 "name": "BaseBdev4", 00:21:33.709 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:33.709 "is_configured": false, 00:21:33.709 "data_offset": 0, 00:21:33.709 "data_size": 0 00:21:33.709 } 00:21:33.709 ] 00:21:33.709 }' 00:21:33.709 12:02:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:33.709 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:34.277 12:02:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:34.536 [2024-07-15 12:02:47.959726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:34.536 BaseBdev3 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:34.536 
12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:34.536 12:02:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:34.795 12:02:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:35.055 [ 00:21:35.055 { 00:21:35.055 "name": "BaseBdev3", 00:21:35.055 "aliases": [ 00:21:35.055 "8963e11d-8486-46f7-99b7-15719d69f4e7" 00:21:35.055 ], 00:21:35.055 "product_name": "Malloc disk", 00:21:35.055 "block_size": 512, 00:21:35.055 "num_blocks": 65536, 00:21:35.055 "uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:35.055 "assigned_rate_limits": { 00:21:35.055 "rw_ios_per_sec": 0, 00:21:35.055 "rw_mbytes_per_sec": 0, 00:21:35.055 "r_mbytes_per_sec": 0, 00:21:35.055 "w_mbytes_per_sec": 0 00:21:35.055 }, 00:21:35.055 "claimed": true, 00:21:35.055 "claim_type": "exclusive_write", 00:21:35.055 "zoned": false, 00:21:35.055 "supported_io_types": { 00:21:35.055 "read": true, 00:21:35.055 "write": true, 00:21:35.055 "unmap": true, 00:21:35.055 "flush": true, 00:21:35.055 "reset": true, 00:21:35.055 "nvme_admin": false, 00:21:35.055 "nvme_io": false, 00:21:35.055 "nvme_io_md": false, 00:21:35.055 "write_zeroes": true, 00:21:35.055 "zcopy": true, 00:21:35.055 "get_zone_info": 
false, 00:21:35.055 "zone_management": false, 00:21:35.055 "zone_append": false, 00:21:35.055 "compare": false, 00:21:35.055 "compare_and_write": false, 00:21:35.055 "abort": true, 00:21:35.055 "seek_hole": false, 00:21:35.055 "seek_data": false, 00:21:35.055 "copy": true, 00:21:35.055 "nvme_iov_md": false 00:21:35.055 }, 00:21:35.055 "memory_domains": [ 00:21:35.055 { 00:21:35.055 "dma_device_id": "system", 00:21:35.055 "dma_device_type": 1 00:21:35.055 }, 00:21:35.055 { 00:21:35.055 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.055 "dma_device_type": 2 00:21:35.055 } 00:21:35.055 ], 00:21:35.055 "driver_specific": {} 00:21:35.055 } 00:21:35.055 ] 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:35.055 12:02:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:35.055 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:35.314 12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:35.314 "name": "Existed_Raid", 00:21:35.314 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:35.314 "strip_size_kb": 64, 00:21:35.314 "state": "configuring", 00:21:35.314 "raid_level": "concat", 00:21:35.314 "superblock": true, 00:21:35.314 "num_base_bdevs": 4, 00:21:35.314 "num_base_bdevs_discovered": 3, 00:21:35.314 "num_base_bdevs_operational": 4, 00:21:35.314 "base_bdevs_list": [ 00:21:35.314 { 00:21:35.314 "name": "BaseBdev1", 00:21:35.314 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:35.314 "is_configured": true, 00:21:35.314 "data_offset": 2048, 00:21:35.314 "data_size": 63488 00:21:35.314 }, 00:21:35.314 { 00:21:35.314 "name": "BaseBdev2", 00:21:35.314 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:35.314 "is_configured": true, 00:21:35.314 "data_offset": 2048, 00:21:35.314 "data_size": 63488 00:21:35.314 }, 00:21:35.314 { 00:21:35.314 "name": "BaseBdev3", 00:21:35.314 "uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:35.314 "is_configured": true, 00:21:35.314 "data_offset": 2048, 00:21:35.314 "data_size": 63488 00:21:35.314 }, 00:21:35.314 { 00:21:35.314 "name": "BaseBdev4", 00:21:35.314 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:35.314 "is_configured": false, 00:21:35.314 "data_offset": 0, 00:21:35.314 "data_size": 0 00:21:35.314 } 00:21:35.314 ] 00:21:35.314 }' 00:21:35.315 
12:02:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:35.315 12:02:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:35.883 12:02:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:36.142 [2024-07-15 12:02:49.547552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:36.142 [2024-07-15 12:02:49.547753] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x248a4a0 00:21:36.142 [2024-07-15 12:02:49.547773] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:36.142 [2024-07-15 12:02:49.547980] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x248a0a0 00:21:36.142 [2024-07-15 12:02:49.548111] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x248a4a0 00:21:36.142 [2024-07-15 12:02:49.548121] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x248a4a0 00:21:36.142 [2024-07-15 12:02:49.548215] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:36.142 BaseBdev4 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # 
bdev_timeout=2000 00:21:36.142 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:36.401 12:02:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:36.661 [ 00:21:36.661 { 00:21:36.661 "name": "BaseBdev4", 00:21:36.661 "aliases": [ 00:21:36.661 "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8" 00:21:36.661 ], 00:21:36.661 "product_name": "Malloc disk", 00:21:36.661 "block_size": 512, 00:21:36.661 "num_blocks": 65536, 00:21:36.661 "uuid": "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8", 00:21:36.661 "assigned_rate_limits": { 00:21:36.661 "rw_ios_per_sec": 0, 00:21:36.661 "rw_mbytes_per_sec": 0, 00:21:36.661 "r_mbytes_per_sec": 0, 00:21:36.661 "w_mbytes_per_sec": 0 00:21:36.661 }, 00:21:36.661 "claimed": true, 00:21:36.661 "claim_type": "exclusive_write", 00:21:36.661 "zoned": false, 00:21:36.661 "supported_io_types": { 00:21:36.661 "read": true, 00:21:36.661 "write": true, 00:21:36.661 "unmap": true, 00:21:36.661 "flush": true, 00:21:36.661 "reset": true, 00:21:36.661 "nvme_admin": false, 00:21:36.661 "nvme_io": false, 00:21:36.661 "nvme_io_md": false, 00:21:36.661 "write_zeroes": true, 00:21:36.661 "zcopy": true, 00:21:36.661 "get_zone_info": false, 00:21:36.661 "zone_management": false, 00:21:36.661 "zone_append": false, 00:21:36.661 "compare": false, 00:21:36.661 "compare_and_write": false, 00:21:36.661 "abort": true, 00:21:36.661 "seek_hole": false, 00:21:36.661 "seek_data": false, 00:21:36.661 "copy": true, 00:21:36.661 "nvme_iov_md": false 00:21:36.661 }, 00:21:36.661 "memory_domains": [ 00:21:36.661 { 00:21:36.661 "dma_device_id": "system", 00:21:36.661 "dma_device_type": 1 00:21:36.661 }, 00:21:36.661 { 00:21:36.661 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:36.661 
"dma_device_type": 2 00:21:36.661 } 00:21:36.661 ], 00:21:36.661 "driver_specific": {} 00:21:36.661 } 00:21:36.661 ] 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.661 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:36.921 12:02:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.921 "name": "Existed_Raid", 00:21:36.921 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:36.921 "strip_size_kb": 64, 00:21:36.921 "state": "online", 00:21:36.921 "raid_level": "concat", 00:21:36.921 "superblock": true, 00:21:36.921 "num_base_bdevs": 4, 00:21:36.921 "num_base_bdevs_discovered": 4, 00:21:36.921 "num_base_bdevs_operational": 4, 00:21:36.921 "base_bdevs_list": [ 00:21:36.921 { 00:21:36.921 "name": "BaseBdev1", 00:21:36.921 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:36.921 "is_configured": true, 00:21:36.921 "data_offset": 2048, 00:21:36.921 "data_size": 63488 00:21:36.921 }, 00:21:36.921 { 00:21:36.921 "name": "BaseBdev2", 00:21:36.921 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:36.921 "is_configured": true, 00:21:36.921 "data_offset": 2048, 00:21:36.921 "data_size": 63488 00:21:36.921 }, 00:21:36.921 { 00:21:36.921 "name": "BaseBdev3", 00:21:36.921 "uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:36.921 "is_configured": true, 00:21:36.921 "data_offset": 2048, 00:21:36.921 "data_size": 63488 00:21:36.921 }, 00:21:36.921 { 00:21:36.921 "name": "BaseBdev4", 00:21:36.921 "uuid": "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8", 00:21:36.921 "is_configured": true, 00:21:36.921 "data_offset": 2048, 00:21:36.921 "data_size": 63488 00:21:36.921 } 00:21:36.921 ] 00:21:36.921 }' 00:21:36.921 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.921 12:02:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:37.487 12:02:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:37.746 [2024-07-15 12:02:51.084160] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:37.746 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:37.746 "name": "Existed_Raid", 00:21:37.746 "aliases": [ 00:21:37.746 "235b06c8-fb19-472d-8819-cf1eafac37e9" 00:21:37.746 ], 00:21:37.746 "product_name": "Raid Volume", 00:21:37.746 "block_size": 512, 00:21:37.746 "num_blocks": 253952, 00:21:37.746 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:37.746 "assigned_rate_limits": { 00:21:37.746 "rw_ios_per_sec": 0, 00:21:37.746 "rw_mbytes_per_sec": 0, 00:21:37.746 "r_mbytes_per_sec": 0, 00:21:37.746 "w_mbytes_per_sec": 0 00:21:37.746 }, 00:21:37.746 "claimed": false, 00:21:37.746 "zoned": false, 00:21:37.746 "supported_io_types": { 00:21:37.746 "read": true, 00:21:37.746 "write": true, 00:21:37.746 "unmap": true, 00:21:37.746 "flush": true, 00:21:37.746 "reset": true, 00:21:37.747 "nvme_admin": false, 00:21:37.747 "nvme_io": false, 00:21:37.747 "nvme_io_md": false, 00:21:37.747 "write_zeroes": true, 00:21:37.747 "zcopy": false, 00:21:37.747 "get_zone_info": false, 00:21:37.747 "zone_management": false, 00:21:37.747 "zone_append": false, 00:21:37.747 "compare": false, 00:21:37.747 "compare_and_write": false, 00:21:37.747 "abort": false, 00:21:37.747 "seek_hole": 
false, 00:21:37.747 "seek_data": false, 00:21:37.747 "copy": false, 00:21:37.747 "nvme_iov_md": false 00:21:37.747 }, 00:21:37.747 "memory_domains": [ 00:21:37.747 { 00:21:37.747 "dma_device_id": "system", 00:21:37.747 "dma_device_type": 1 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.747 "dma_device_type": 2 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "system", 00:21:37.747 "dma_device_type": 1 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.747 "dma_device_type": 2 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "system", 00:21:37.747 "dma_device_type": 1 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.747 "dma_device_type": 2 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "system", 00:21:37.747 "dma_device_type": 1 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:37.747 "dma_device_type": 2 00:21:37.747 } 00:21:37.747 ], 00:21:37.747 "driver_specific": { 00:21:37.747 "raid": { 00:21:37.747 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:37.747 "strip_size_kb": 64, 00:21:37.747 "state": "online", 00:21:37.747 "raid_level": "concat", 00:21:37.747 "superblock": true, 00:21:37.747 "num_base_bdevs": 4, 00:21:37.747 "num_base_bdevs_discovered": 4, 00:21:37.747 "num_base_bdevs_operational": 4, 00:21:37.747 "base_bdevs_list": [ 00:21:37.747 { 00:21:37.747 "name": "BaseBdev1", 00:21:37.747 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:37.747 "is_configured": true, 00:21:37.747 "data_offset": 2048, 00:21:37.747 "data_size": 63488 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "name": "BaseBdev2", 00:21:37.747 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:37.747 "is_configured": true, 00:21:37.747 "data_offset": 2048, 00:21:37.747 "data_size": 63488 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "name": "BaseBdev3", 00:21:37.747 
"uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:37.747 "is_configured": true, 00:21:37.747 "data_offset": 2048, 00:21:37.747 "data_size": 63488 00:21:37.747 }, 00:21:37.747 { 00:21:37.747 "name": "BaseBdev4", 00:21:37.747 "uuid": "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8", 00:21:37.747 "is_configured": true, 00:21:37.747 "data_offset": 2048, 00:21:37.747 "data_size": 63488 00:21:37.747 } 00:21:37.747 ] 00:21:37.747 } 00:21:37.747 } 00:21:37.747 }' 00:21:37.747 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:37.747 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:37.747 BaseBdev2 00:21:37.747 BaseBdev3 00:21:37.747 BaseBdev4' 00:21:37.747 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:37.747 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:37.747 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:38.006 "name": "BaseBdev1", 00:21:38.006 "aliases": [ 00:21:38.006 "722fa310-2619-4fa5-bb12-e221c8ef9dd2" 00:21:38.006 ], 00:21:38.006 "product_name": "Malloc disk", 00:21:38.006 "block_size": 512, 00:21:38.006 "num_blocks": 65536, 00:21:38.006 "uuid": "722fa310-2619-4fa5-bb12-e221c8ef9dd2", 00:21:38.006 "assigned_rate_limits": { 00:21:38.006 "rw_ios_per_sec": 0, 00:21:38.006 "rw_mbytes_per_sec": 0, 00:21:38.006 "r_mbytes_per_sec": 0, 00:21:38.006 "w_mbytes_per_sec": 0 00:21:38.006 }, 00:21:38.006 "claimed": true, 00:21:38.006 "claim_type": "exclusive_write", 00:21:38.006 "zoned": false, 00:21:38.006 "supported_io_types": { 
00:21:38.006 "read": true, 00:21:38.006 "write": true, 00:21:38.006 "unmap": true, 00:21:38.006 "flush": true, 00:21:38.006 "reset": true, 00:21:38.006 "nvme_admin": false, 00:21:38.006 "nvme_io": false, 00:21:38.006 "nvme_io_md": false, 00:21:38.006 "write_zeroes": true, 00:21:38.006 "zcopy": true, 00:21:38.006 "get_zone_info": false, 00:21:38.006 "zone_management": false, 00:21:38.006 "zone_append": false, 00:21:38.006 "compare": false, 00:21:38.006 "compare_and_write": false, 00:21:38.006 "abort": true, 00:21:38.006 "seek_hole": false, 00:21:38.006 "seek_data": false, 00:21:38.006 "copy": true, 00:21:38.006 "nvme_iov_md": false 00:21:38.006 }, 00:21:38.006 "memory_domains": [ 00:21:38.006 { 00:21:38.006 "dma_device_id": "system", 00:21:38.006 "dma_device_type": 1 00:21:38.006 }, 00:21:38.006 { 00:21:38.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.006 "dma_device_type": 2 00:21:38.006 } 00:21:38.006 ], 00:21:38.006 "driver_specific": {} 00:21:38.006 }' 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:38.006 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:38.265 12:02:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:38.524 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:38.524 "name": "BaseBdev2", 00:21:38.524 "aliases": [ 00:21:38.524 "03f89940-af19-475d-8921-30860dd28781" 00:21:38.524 ], 00:21:38.524 "product_name": "Malloc disk", 00:21:38.524 "block_size": 512, 00:21:38.524 "num_blocks": 65536, 00:21:38.524 "uuid": "03f89940-af19-475d-8921-30860dd28781", 00:21:38.524 "assigned_rate_limits": { 00:21:38.524 "rw_ios_per_sec": 0, 00:21:38.524 "rw_mbytes_per_sec": 0, 00:21:38.524 "r_mbytes_per_sec": 0, 00:21:38.524 "w_mbytes_per_sec": 0 00:21:38.524 }, 00:21:38.524 "claimed": true, 00:21:38.524 "claim_type": "exclusive_write", 00:21:38.524 "zoned": false, 00:21:38.524 "supported_io_types": { 00:21:38.524 "read": true, 00:21:38.524 "write": true, 00:21:38.524 "unmap": true, 00:21:38.524 "flush": true, 00:21:38.524 "reset": true, 00:21:38.524 "nvme_admin": false, 00:21:38.524 "nvme_io": false, 00:21:38.524 "nvme_io_md": false, 00:21:38.524 "write_zeroes": true, 00:21:38.524 "zcopy": true, 00:21:38.524 "get_zone_info": false, 00:21:38.524 "zone_management": false, 00:21:38.524 "zone_append": false, 00:21:38.524 "compare": false, 00:21:38.524 "compare_and_write": false, 00:21:38.524 "abort": true, 00:21:38.524 "seek_hole": false, 00:21:38.524 "seek_data": 
false, 00:21:38.524 "copy": true, 00:21:38.524 "nvme_iov_md": false 00:21:38.524 }, 00:21:38.524 "memory_domains": [ 00:21:38.524 { 00:21:38.524 "dma_device_id": "system", 00:21:38.524 "dma_device_type": 1 00:21:38.524 }, 00:21:38.524 { 00:21:38.524 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:38.524 "dma_device_type": 2 00:21:38.524 } 00:21:38.524 ], 00:21:38.524 "driver_specific": {} 00:21:38.524 }' 00:21:38.524 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.524 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:38.524 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:38.524 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 00:21:38.783 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:39.042 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:39.042 "name": "BaseBdev3", 00:21:39.042 "aliases": [ 00:21:39.042 "8963e11d-8486-46f7-99b7-15719d69f4e7" 00:21:39.042 ], 00:21:39.042 "product_name": "Malloc disk", 00:21:39.042 "block_size": 512, 00:21:39.042 "num_blocks": 65536, 00:21:39.042 "uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:39.042 "assigned_rate_limits": { 00:21:39.042 "rw_ios_per_sec": 0, 00:21:39.042 "rw_mbytes_per_sec": 0, 00:21:39.042 "r_mbytes_per_sec": 0, 00:21:39.042 "w_mbytes_per_sec": 0 00:21:39.042 }, 00:21:39.042 "claimed": true, 00:21:39.042 "claim_type": "exclusive_write", 00:21:39.042 "zoned": false, 00:21:39.042 "supported_io_types": { 00:21:39.042 "read": true, 00:21:39.042 "write": true, 00:21:39.042 "unmap": true, 00:21:39.042 "flush": true, 00:21:39.042 "reset": true, 00:21:39.042 "nvme_admin": false, 00:21:39.042 "nvme_io": false, 00:21:39.042 "nvme_io_md": false, 00:21:39.042 "write_zeroes": true, 00:21:39.042 "zcopy": true, 00:21:39.042 "get_zone_info": false, 00:21:39.042 "zone_management": false, 00:21:39.042 "zone_append": false, 00:21:39.042 "compare": false, 00:21:39.042 "compare_and_write": false, 00:21:39.042 "abort": true, 00:21:39.042 "seek_hole": false, 00:21:39.042 "seek_data": false, 00:21:39.042 "copy": true, 00:21:39.042 "nvme_iov_md": false 00:21:39.042 }, 00:21:39.042 "memory_domains": [ 00:21:39.042 { 00:21:39.042 "dma_device_id": "system", 00:21:39.042 "dma_device_type": 1 00:21:39.042 }, 00:21:39.042 { 00:21:39.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:39.042 "dma_device_type": 2 00:21:39.042 } 00:21:39.042 ], 00:21:39.042 "driver_specific": {} 00:21:39.042 }' 00:21:39.042 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.353 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:39.612 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:39.612 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:39.612 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:39.612 12:02:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:39.870 "name": "BaseBdev4", 00:21:39.870 "aliases": [ 00:21:39.870 "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8" 00:21:39.870 ], 00:21:39.870 "product_name": "Malloc disk", 00:21:39.870 "block_size": 512, 00:21:39.870 "num_blocks": 65536, 00:21:39.870 "uuid": "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8", 00:21:39.870 "assigned_rate_limits": { 00:21:39.870 
"rw_ios_per_sec": 0, 00:21:39.870 "rw_mbytes_per_sec": 0, 00:21:39.870 "r_mbytes_per_sec": 0, 00:21:39.870 "w_mbytes_per_sec": 0 00:21:39.870 }, 00:21:39.870 "claimed": true, 00:21:39.870 "claim_type": "exclusive_write", 00:21:39.870 "zoned": false, 00:21:39.870 "supported_io_types": { 00:21:39.870 "read": true, 00:21:39.870 "write": true, 00:21:39.870 "unmap": true, 00:21:39.870 "flush": true, 00:21:39.870 "reset": true, 00:21:39.870 "nvme_admin": false, 00:21:39.870 "nvme_io": false, 00:21:39.870 "nvme_io_md": false, 00:21:39.870 "write_zeroes": true, 00:21:39.870 "zcopy": true, 00:21:39.870 "get_zone_info": false, 00:21:39.870 "zone_management": false, 00:21:39.870 "zone_append": false, 00:21:39.870 "compare": false, 00:21:39.870 "compare_and_write": false, 00:21:39.870 "abort": true, 00:21:39.870 "seek_hole": false, 00:21:39.870 "seek_data": false, 00:21:39.870 "copy": true, 00:21:39.870 "nvme_iov_md": false 00:21:39.870 }, 00:21:39.870 "memory_domains": [ 00:21:39.870 { 00:21:39.870 "dma_device_id": "system", 00:21:39.870 "dma_device_type": 1 00:21:39.870 }, 00:21:39.870 { 00:21:39.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:39.870 "dma_device_type": 2 00:21:39.870 } 00:21:39.870 ], 00:21:39.870 "driver_specific": {} 00:21:39.870 }' 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:39.870 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:40.129 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:40.129 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:40.129 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:40.387 [2024-07-15 12:02:53.775037] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:40.387 [2024-07-15 12:02:53.775071] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:40.387 [2024-07-15 12:02:53.775122] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.387 12:02:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:40.645 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.645 "name": "Existed_Raid", 00:21:40.645 "uuid": "235b06c8-fb19-472d-8819-cf1eafac37e9", 00:21:40.645 "strip_size_kb": 64, 00:21:40.645 "state": "offline", 00:21:40.645 "raid_level": "concat", 00:21:40.645 "superblock": true, 00:21:40.645 "num_base_bdevs": 4, 00:21:40.645 "num_base_bdevs_discovered": 3, 00:21:40.645 "num_base_bdevs_operational": 3, 00:21:40.645 "base_bdevs_list": [ 00:21:40.645 { 00:21:40.645 "name": null, 00:21:40.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.645 "is_configured": false, 00:21:40.645 "data_offset": 2048, 00:21:40.645 "data_size": 63488 00:21:40.645 }, 00:21:40.645 { 00:21:40.645 "name": "BaseBdev2", 00:21:40.645 "uuid": 
"03f89940-af19-475d-8921-30860dd28781", 00:21:40.645 "is_configured": true, 00:21:40.645 "data_offset": 2048, 00:21:40.645 "data_size": 63488 00:21:40.645 }, 00:21:40.645 { 00:21:40.645 "name": "BaseBdev3", 00:21:40.645 "uuid": "8963e11d-8486-46f7-99b7-15719d69f4e7", 00:21:40.645 "is_configured": true, 00:21:40.645 "data_offset": 2048, 00:21:40.645 "data_size": 63488 00:21:40.645 }, 00:21:40.645 { 00:21:40.645 "name": "BaseBdev4", 00:21:40.645 "uuid": "7681f7e3-44b9-44a8-8bf7-bfb9c5cef4d8", 00:21:40.645 "is_configured": true, 00:21:40.645 "data_offset": 2048, 00:21:40.645 "data_size": 63488 00:21:40.645 } 00:21:40.645 ] 00:21:40.645 }' 00:21:40.645 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.645 12:02:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:41.212 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:41.212 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:41.212 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.212 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:41.471 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:41.471 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:41.471 12:02:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:41.471 [2024-07-15 12:02:55.064203] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:41.730 12:02:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:41.730 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:41.730 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.730 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:41.989 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:41.990 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:41.990 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:41.990 [2024-07-15 12:02:55.572248] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:42.248 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:42.248 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:42.248 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.248 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:42.507 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:42.507 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:42.507 12:02:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:42.507 [2024-07-15 12:02:56.077933] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:42.507 [2024-07-15 12:02:56.077971] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x248a4a0 name Existed_Raid, state offline 00:21:42.766 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:42.766 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:42.766 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:42.766 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:43.025 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:43.284 BaseBdev2 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:43.284 12:02:56 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:43.284 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:43.542 12:02:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:43.542 [ 00:21:43.542 { 00:21:43.542 "name": "BaseBdev2", 00:21:43.542 "aliases": [ 00:21:43.543 "d1ee9f74-3118-493b-bde1-89aaea92ec57" 00:21:43.543 ], 00:21:43.543 "product_name": "Malloc disk", 00:21:43.543 "block_size": 512, 00:21:43.543 "num_blocks": 65536, 00:21:43.543 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:43.543 "assigned_rate_limits": { 00:21:43.543 "rw_ios_per_sec": 0, 00:21:43.543 "rw_mbytes_per_sec": 0, 00:21:43.543 "r_mbytes_per_sec": 0, 00:21:43.543 "w_mbytes_per_sec": 0 00:21:43.543 }, 00:21:43.543 "claimed": false, 00:21:43.543 "zoned": false, 00:21:43.543 "supported_io_types": { 00:21:43.543 "read": true, 00:21:43.543 "write": true, 00:21:43.543 "unmap": true, 00:21:43.543 "flush": true, 00:21:43.543 "reset": true, 00:21:43.543 "nvme_admin": false, 00:21:43.543 "nvme_io": false, 00:21:43.543 "nvme_io_md": false, 00:21:43.543 "write_zeroes": true, 00:21:43.543 "zcopy": true, 00:21:43.543 "get_zone_info": false, 00:21:43.543 "zone_management": false, 00:21:43.543 "zone_append": false, 00:21:43.543 "compare": false, 00:21:43.543 "compare_and_write": false, 00:21:43.543 "abort": true, 00:21:43.543 "seek_hole": false, 00:21:43.543 "seek_data": false, 00:21:43.543 "copy": true, 00:21:43.543 "nvme_iov_md": 
false 00:21:43.543 }, 00:21:43.543 "memory_domains": [ 00:21:43.543 { 00:21:43.543 "dma_device_id": "system", 00:21:43.543 "dma_device_type": 1 00:21:43.543 }, 00:21:43.543 { 00:21:43.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:43.543 "dma_device_type": 2 00:21:43.543 } 00:21:43.543 ], 00:21:43.543 "driver_specific": {} 00:21:43.543 } 00:21:43.543 ] 00:21:43.543 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:43.543 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:43.543 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:43.543 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:43.802 BaseBdev3 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:43.802 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.061 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:44.320 [ 00:21:44.320 { 00:21:44.320 "name": "BaseBdev3", 00:21:44.320 "aliases": [ 00:21:44.320 "9f02a8cb-8d3a-479e-b419-ad1ede368c30" 00:21:44.320 ], 00:21:44.320 "product_name": "Malloc disk", 00:21:44.320 "block_size": 512, 00:21:44.320 "num_blocks": 65536, 00:21:44.320 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:44.320 "assigned_rate_limits": { 00:21:44.320 "rw_ios_per_sec": 0, 00:21:44.320 "rw_mbytes_per_sec": 0, 00:21:44.320 "r_mbytes_per_sec": 0, 00:21:44.320 "w_mbytes_per_sec": 0 00:21:44.320 }, 00:21:44.320 "claimed": false, 00:21:44.320 "zoned": false, 00:21:44.320 "supported_io_types": { 00:21:44.320 "read": true, 00:21:44.320 "write": true, 00:21:44.320 "unmap": true, 00:21:44.320 "flush": true, 00:21:44.320 "reset": true, 00:21:44.320 "nvme_admin": false, 00:21:44.320 "nvme_io": false, 00:21:44.320 "nvme_io_md": false, 00:21:44.320 "write_zeroes": true, 00:21:44.320 "zcopy": true, 00:21:44.320 "get_zone_info": false, 00:21:44.320 "zone_management": false, 00:21:44.320 "zone_append": false, 00:21:44.320 "compare": false, 00:21:44.320 "compare_and_write": false, 00:21:44.320 "abort": true, 00:21:44.320 "seek_hole": false, 00:21:44.320 "seek_data": false, 00:21:44.320 "copy": true, 00:21:44.320 "nvme_iov_md": false 00:21:44.320 }, 00:21:44.320 "memory_domains": [ 00:21:44.320 { 00:21:44.320 "dma_device_id": "system", 00:21:44.320 "dma_device_type": 1 00:21:44.320 }, 00:21:44.320 { 00:21:44.320 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.320 "dma_device_type": 2 00:21:44.320 } 00:21:44.320 ], 00:21:44.320 "driver_specific": {} 00:21:44.320 } 00:21:44.320 ] 00:21:44.320 12:02:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:44.320 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:44.320 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:21:44.320 12:02:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:44.580 BaseBdev4 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:44.580 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:44.839 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:45.098 [ 00:21:45.098 { 00:21:45.098 "name": "BaseBdev4", 00:21:45.098 "aliases": [ 00:21:45.098 "0f46050e-009a-4478-885d-006de1b24c09" 00:21:45.098 ], 00:21:45.098 "product_name": "Malloc disk", 00:21:45.098 "block_size": 512, 00:21:45.098 "num_blocks": 65536, 00:21:45.098 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:45.098 "assigned_rate_limits": { 00:21:45.098 "rw_ios_per_sec": 0, 00:21:45.098 "rw_mbytes_per_sec": 0, 00:21:45.098 "r_mbytes_per_sec": 0, 00:21:45.098 "w_mbytes_per_sec": 0 00:21:45.098 }, 00:21:45.098 "claimed": false, 00:21:45.098 "zoned": false, 00:21:45.098 "supported_io_types": { 00:21:45.098 
"read": true, 00:21:45.098 "write": true, 00:21:45.098 "unmap": true, 00:21:45.098 "flush": true, 00:21:45.098 "reset": true, 00:21:45.098 "nvme_admin": false, 00:21:45.098 "nvme_io": false, 00:21:45.098 "nvme_io_md": false, 00:21:45.098 "write_zeroes": true, 00:21:45.098 "zcopy": true, 00:21:45.098 "get_zone_info": false, 00:21:45.098 "zone_management": false, 00:21:45.098 "zone_append": false, 00:21:45.098 "compare": false, 00:21:45.098 "compare_and_write": false, 00:21:45.098 "abort": true, 00:21:45.098 "seek_hole": false, 00:21:45.098 "seek_data": false, 00:21:45.098 "copy": true, 00:21:45.098 "nvme_iov_md": false 00:21:45.098 }, 00:21:45.098 "memory_domains": [ 00:21:45.098 { 00:21:45.098 "dma_device_id": "system", 00:21:45.098 "dma_device_type": 1 00:21:45.098 }, 00:21:45.098 { 00:21:45.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.098 "dma_device_type": 2 00:21:45.098 } 00:21:45.098 ], 00:21:45.098 "driver_specific": {} 00:21:45.098 } 00:21:45.098 ] 00:21:45.098 12:02:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:45.098 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:45.098 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:45.098 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:45.357 [2024-07-15 12:02:58.818361] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:45.357 [2024-07-15 12:02:58.818406] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:45.357 [2024-07-15 12:02:58.818426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:45.357 [2024-07-15 
12:02:58.819829] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:45.357 [2024-07-15 12:02:58.819874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.357 12:02:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:45.617 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.617 "name": "Existed_Raid", 00:21:45.617 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:45.617 "strip_size_kb": 64, 
00:21:45.617 "state": "configuring", 00:21:45.617 "raid_level": "concat", 00:21:45.617 "superblock": true, 00:21:45.617 "num_base_bdevs": 4, 00:21:45.617 "num_base_bdevs_discovered": 3, 00:21:45.617 "num_base_bdevs_operational": 4, 00:21:45.617 "base_bdevs_list": [ 00:21:45.617 { 00:21:45.617 "name": "BaseBdev1", 00:21:45.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:45.617 "is_configured": false, 00:21:45.617 "data_offset": 0, 00:21:45.617 "data_size": 0 00:21:45.617 }, 00:21:45.617 { 00:21:45.617 "name": "BaseBdev2", 00:21:45.617 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:45.617 "is_configured": true, 00:21:45.617 "data_offset": 2048, 00:21:45.617 "data_size": 63488 00:21:45.617 }, 00:21:45.617 { 00:21:45.617 "name": "BaseBdev3", 00:21:45.617 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:45.617 "is_configured": true, 00:21:45.617 "data_offset": 2048, 00:21:45.617 "data_size": 63488 00:21:45.617 }, 00:21:45.617 { 00:21:45.617 "name": "BaseBdev4", 00:21:45.617 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:45.617 "is_configured": true, 00:21:45.617 "data_offset": 2048, 00:21:45.617 "data_size": 63488 00:21:45.617 } 00:21:45.617 ] 00:21:45.617 }' 00:21:45.617 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.617 12:02:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:46.185 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:46.444 [2024-07-15 12:02:59.905210] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:46.444 12:02:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.704 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:46.704 "name": "Existed_Raid", 00:21:46.704 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:46.704 "strip_size_kb": 64, 00:21:46.704 "state": "configuring", 00:21:46.704 "raid_level": "concat", 00:21:46.704 "superblock": true, 00:21:46.704 "num_base_bdevs": 4, 00:21:46.704 "num_base_bdevs_discovered": 2, 00:21:46.704 "num_base_bdevs_operational": 4, 00:21:46.704 "base_bdevs_list": [ 00:21:46.704 { 00:21:46.704 "name": "BaseBdev1", 00:21:46.704 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:46.704 "is_configured": false, 00:21:46.704 "data_offset": 0, 00:21:46.704 "data_size": 0 
00:21:46.704 }, 00:21:46.704 { 00:21:46.704 "name": null, 00:21:46.704 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:46.704 "is_configured": false, 00:21:46.704 "data_offset": 2048, 00:21:46.704 "data_size": 63488 00:21:46.704 }, 00:21:46.704 { 00:21:46.704 "name": "BaseBdev3", 00:21:46.704 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:46.704 "is_configured": true, 00:21:46.704 "data_offset": 2048, 00:21:46.704 "data_size": 63488 00:21:46.704 }, 00:21:46.704 { 00:21:46.704 "name": "BaseBdev4", 00:21:46.704 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:46.704 "is_configured": true, 00:21:46.704 "data_offset": 2048, 00:21:46.704 "data_size": 63488 00:21:46.704 } 00:21:46.704 ] 00:21:46.704 }' 00:21:46.704 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:46.704 12:03:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:47.274 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.274 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:47.533 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:47.533 12:03:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:48.102 [2024-07-15 12:03:01.453874] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:48.102 BaseBdev1 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 
00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:48.102 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:48.362 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:48.621 [ 00:21:48.621 { 00:21:48.621 "name": "BaseBdev1", 00:21:48.621 "aliases": [ 00:21:48.621 "aeccc50a-fd74-474c-85de-84168baf681f" 00:21:48.621 ], 00:21:48.621 "product_name": "Malloc disk", 00:21:48.621 "block_size": 512, 00:21:48.621 "num_blocks": 65536, 00:21:48.621 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:48.621 "assigned_rate_limits": { 00:21:48.621 "rw_ios_per_sec": 0, 00:21:48.621 "rw_mbytes_per_sec": 0, 00:21:48.621 "r_mbytes_per_sec": 0, 00:21:48.621 "w_mbytes_per_sec": 0 00:21:48.621 }, 00:21:48.621 "claimed": true, 00:21:48.621 "claim_type": "exclusive_write", 00:21:48.621 "zoned": false, 00:21:48.621 "supported_io_types": { 00:21:48.621 "read": true, 00:21:48.621 "write": true, 00:21:48.621 "unmap": true, 00:21:48.621 "flush": true, 00:21:48.621 "reset": true, 00:21:48.621 "nvme_admin": false, 00:21:48.621 "nvme_io": false, 00:21:48.621 "nvme_io_md": false, 00:21:48.621 "write_zeroes": true, 00:21:48.621 "zcopy": true, 00:21:48.621 "get_zone_info": false, 00:21:48.621 "zone_management": false, 00:21:48.621 "zone_append": false, 00:21:48.621 "compare": false, 
00:21:48.621 "compare_and_write": false, 00:21:48.621 "abort": true, 00:21:48.621 "seek_hole": false, 00:21:48.621 "seek_data": false, 00:21:48.621 "copy": true, 00:21:48.621 "nvme_iov_md": false 00:21:48.621 }, 00:21:48.621 "memory_domains": [ 00:21:48.621 { 00:21:48.621 "dma_device_id": "system", 00:21:48.622 "dma_device_type": 1 00:21:48.622 }, 00:21:48.622 { 00:21:48.622 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:48.622 "dma_device_type": 2 00:21:48.622 } 00:21:48.622 ], 00:21:48.622 "driver_specific": {} 00:21:48.622 } 00:21:48.622 ] 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.622 12:03:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:49.190 12:03:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.190 "name": "Existed_Raid", 00:21:49.190 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:49.190 "strip_size_kb": 64, 00:21:49.190 "state": "configuring", 00:21:49.190 "raid_level": "concat", 00:21:49.190 "superblock": true, 00:21:49.190 "num_base_bdevs": 4, 00:21:49.190 "num_base_bdevs_discovered": 3, 00:21:49.190 "num_base_bdevs_operational": 4, 00:21:49.190 "base_bdevs_list": [ 00:21:49.190 { 00:21:49.190 "name": "BaseBdev1", 00:21:49.190 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:49.190 "is_configured": true, 00:21:49.190 "data_offset": 2048, 00:21:49.190 "data_size": 63488 00:21:49.190 }, 00:21:49.190 { 00:21:49.190 "name": null, 00:21:49.190 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:49.190 "is_configured": false, 00:21:49.190 "data_offset": 2048, 00:21:49.190 "data_size": 63488 00:21:49.190 }, 00:21:49.190 { 00:21:49.191 "name": "BaseBdev3", 00:21:49.191 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:49.191 "is_configured": true, 00:21:49.191 "data_offset": 2048, 00:21:49.191 "data_size": 63488 00:21:49.191 }, 00:21:49.191 { 00:21:49.191 "name": "BaseBdev4", 00:21:49.191 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:49.191 "is_configured": true, 00:21:49.191 "data_offset": 2048, 00:21:49.191 "data_size": 63488 00:21:49.191 } 00:21:49.191 ] 00:21:49.191 }' 00:21:49.191 12:03:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.191 12:03:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:49.770 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.770 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:49.770 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:49.770 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:50.341 [2024-07-15 12:03:03.812203] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:50.341 12:03:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:50.963 12:03:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.963 "name": "Existed_Raid", 00:21:50.963 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:50.963 "strip_size_kb": 64, 00:21:50.963 "state": "configuring", 00:21:50.963 "raid_level": "concat", 00:21:50.963 "superblock": true, 00:21:50.963 "num_base_bdevs": 4, 00:21:50.963 "num_base_bdevs_discovered": 2, 00:21:50.963 "num_base_bdevs_operational": 4, 00:21:50.963 "base_bdevs_list": [ 00:21:50.963 { 00:21:50.963 "name": "BaseBdev1", 00:21:50.963 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:50.963 "is_configured": true, 00:21:50.963 "data_offset": 2048, 00:21:50.963 "data_size": 63488 00:21:50.963 }, 00:21:50.963 { 00:21:50.963 "name": null, 00:21:50.963 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:50.963 "is_configured": false, 00:21:50.963 "data_offset": 2048, 00:21:50.963 "data_size": 63488 00:21:50.963 }, 00:21:50.963 { 00:21:50.963 "name": null, 00:21:50.963 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:50.964 "is_configured": false, 00:21:50.964 "data_offset": 2048, 00:21:50.964 "data_size": 63488 00:21:50.964 }, 00:21:50.964 { 00:21:50.964 "name": "BaseBdev4", 00:21:50.964 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:50.964 "is_configured": true, 00:21:50.964 "data_offset": 2048, 00:21:50.964 "data_size": 63488 00:21:50.964 } 00:21:50.964 ] 00:21:50.964 }' 00:21:50.964 12:03:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.964 12:03:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:51.543 12:03:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:51.543 12:03:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:51.802 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:51.802 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:52.061 [2024-07-15 12:03:05.444529] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:52.061 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:52.320 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.320 "name": "Existed_Raid", 00:21:52.320 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:52.320 "strip_size_kb": 64, 00:21:52.320 "state": "configuring", 00:21:52.320 "raid_level": "concat", 00:21:52.320 "superblock": true, 00:21:52.320 "num_base_bdevs": 4, 00:21:52.320 "num_base_bdevs_discovered": 3, 00:21:52.320 "num_base_bdevs_operational": 4, 00:21:52.320 "base_bdevs_list": [ 00:21:52.320 { 00:21:52.320 "name": "BaseBdev1", 00:21:52.320 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:52.320 "is_configured": true, 00:21:52.320 "data_offset": 2048, 00:21:52.320 "data_size": 63488 00:21:52.321 }, 00:21:52.321 { 00:21:52.321 "name": null, 00:21:52.321 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:52.321 "is_configured": false, 00:21:52.321 "data_offset": 2048, 00:21:52.321 "data_size": 63488 00:21:52.321 }, 00:21:52.321 { 00:21:52.321 "name": "BaseBdev3", 00:21:52.321 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:52.321 "is_configured": true, 00:21:52.321 "data_offset": 2048, 00:21:52.321 "data_size": 63488 00:21:52.321 }, 00:21:52.321 { 00:21:52.321 "name": "BaseBdev4", 00:21:52.321 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:52.321 "is_configured": true, 00:21:52.321 "data_offset": 2048, 00:21:52.321 "data_size": 63488 00:21:52.321 } 00:21:52.321 ] 00:21:52.321 }' 00:21:52.321 12:03:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.321 12:03:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:53.258 12:03:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:53.258 12:03:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:53.258 12:03:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:53.258 12:03:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:53.518 [2024-07-15 12:03:07.060826] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.518 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.518 
12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:53.777 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.777 "name": "Existed_Raid", 00:21:53.777 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:53.777 "strip_size_kb": 64, 00:21:53.777 "state": "configuring", 00:21:53.777 "raid_level": "concat", 00:21:53.777 "superblock": true, 00:21:53.777 "num_base_bdevs": 4, 00:21:53.777 "num_base_bdevs_discovered": 2, 00:21:53.777 "num_base_bdevs_operational": 4, 00:21:53.777 "base_bdevs_list": [ 00:21:53.777 { 00:21:53.777 "name": null, 00:21:53.777 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:53.777 "is_configured": false, 00:21:53.778 "data_offset": 2048, 00:21:53.778 "data_size": 63488 00:21:53.778 }, 00:21:53.778 { 00:21:53.778 "name": null, 00:21:53.778 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:53.778 "is_configured": false, 00:21:53.778 "data_offset": 2048, 00:21:53.778 "data_size": 63488 00:21:53.778 }, 00:21:53.778 { 00:21:53.778 "name": "BaseBdev3", 00:21:53.778 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:53.778 "is_configured": true, 00:21:53.778 "data_offset": 2048, 00:21:53.778 "data_size": 63488 00:21:53.778 }, 00:21:53.778 { 00:21:53.778 "name": "BaseBdev4", 00:21:53.778 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:53.778 "is_configured": true, 00:21:53.778 "data_offset": 2048, 00:21:53.778 "data_size": 63488 00:21:53.778 } 00:21:53.778 ] 00:21:53.778 }' 00:21:53.778 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.778 12:03:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:54.345 12:03:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.345 12:03:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:54.605 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:54.605 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:54.864 [2024-07-15 12:03:08.384095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:54.864 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:54.864 12:03:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:55.123 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:55.123 "name": "Existed_Raid", 00:21:55.123 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:55.123 "strip_size_kb": 64, 00:21:55.123 "state": "configuring", 00:21:55.123 "raid_level": "concat", 00:21:55.123 "superblock": true, 00:21:55.123 "num_base_bdevs": 4, 00:21:55.123 "num_base_bdevs_discovered": 3, 00:21:55.123 "num_base_bdevs_operational": 4, 00:21:55.123 "base_bdevs_list": [ 00:21:55.123 { 00:21:55.123 "name": null, 00:21:55.123 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:55.123 "is_configured": false, 00:21:55.123 "data_offset": 2048, 00:21:55.123 "data_size": 63488 00:21:55.123 }, 00:21:55.123 { 00:21:55.123 "name": "BaseBdev2", 00:21:55.123 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:55.123 "is_configured": true, 00:21:55.123 "data_offset": 2048, 00:21:55.123 "data_size": 63488 00:21:55.123 }, 00:21:55.123 { 00:21:55.123 "name": "BaseBdev3", 00:21:55.123 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:55.123 "is_configured": true, 00:21:55.123 "data_offset": 2048, 00:21:55.123 "data_size": 63488 00:21:55.123 }, 00:21:55.123 { 00:21:55.123 "name": "BaseBdev4", 00:21:55.123 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:55.123 "is_configured": true, 00:21:55.123 "data_offset": 2048, 00:21:55.123 "data_size": 63488 00:21:55.123 } 00:21:55.123 ] 00:21:55.123 }' 00:21:55.123 12:03:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:55.123 12:03:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:55.691 12:03:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.691 12:03:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:55.950 12:03:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:55.950 12:03:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:55.950 12:03:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:56.208 12:03:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u aeccc50a-fd74-474c-85de-84168baf681f 00:21:56.467 [2024-07-15 12:03:09.987796] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:56.467 [2024-07-15 12:03:09.987954] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x248b120 00:21:56.467 [2024-07-15 12:03:09.987967] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:21:56.467 [2024-07-15 12:03:09.988154] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2489af0 00:21:56.467 [2024-07-15 12:03:09.988272] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x248b120 00:21:56.467 [2024-07-15 12:03:09.988283] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x248b120 00:21:56.467 [2024-07-15 12:03:09.988375] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:56.467 NewBaseBdev 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:21:56.467 12:03:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:21:56.467 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:56.724 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:56.982 [ 00:21:56.982 { 00:21:56.982 "name": "NewBaseBdev", 00:21:56.982 "aliases": [ 00:21:56.982 "aeccc50a-fd74-474c-85de-84168baf681f" 00:21:56.982 ], 00:21:56.982 "product_name": "Malloc disk", 00:21:56.982 "block_size": 512, 00:21:56.982 "num_blocks": 65536, 00:21:56.982 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:56.982 "assigned_rate_limits": { 00:21:56.982 "rw_ios_per_sec": 0, 00:21:56.982 "rw_mbytes_per_sec": 0, 00:21:56.982 "r_mbytes_per_sec": 0, 00:21:56.982 "w_mbytes_per_sec": 0 00:21:56.982 }, 00:21:56.982 "claimed": true, 00:21:56.982 "claim_type": "exclusive_write", 00:21:56.982 "zoned": false, 00:21:56.982 "supported_io_types": { 00:21:56.982 "read": true, 00:21:56.982 "write": true, 00:21:56.982 "unmap": true, 00:21:56.982 "flush": true, 00:21:56.982 "reset": true, 00:21:56.982 "nvme_admin": false, 00:21:56.982 "nvme_io": false, 00:21:56.982 "nvme_io_md": false, 00:21:56.982 "write_zeroes": true, 00:21:56.982 "zcopy": true, 00:21:56.982 "get_zone_info": false, 00:21:56.982 "zone_management": false, 00:21:56.982 "zone_append": false, 00:21:56.982 "compare": false, 00:21:56.982 
"compare_and_write": false, 00:21:56.982 "abort": true, 00:21:56.982 "seek_hole": false, 00:21:56.982 "seek_data": false, 00:21:56.982 "copy": true, 00:21:56.982 "nvme_iov_md": false 00:21:56.982 }, 00:21:56.982 "memory_domains": [ 00:21:56.982 { 00:21:56.982 "dma_device_id": "system", 00:21:56.982 "dma_device_type": 1 00:21:56.982 }, 00:21:56.982 { 00:21:56.982 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:56.982 "dma_device_type": 2 00:21:56.982 } 00:21:56.982 ], 00:21:56.982 "driver_specific": {} 00:21:56.982 } 00:21:56.982 ] 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:56.982 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:57.240 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:57.240 "name": "Existed_Raid", 00:21:57.240 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:57.240 "strip_size_kb": 64, 00:21:57.240 "state": "online", 00:21:57.240 "raid_level": "concat", 00:21:57.240 "superblock": true, 00:21:57.240 "num_base_bdevs": 4, 00:21:57.240 "num_base_bdevs_discovered": 4, 00:21:57.240 "num_base_bdevs_operational": 4, 00:21:57.240 "base_bdevs_list": [ 00:21:57.240 { 00:21:57.240 "name": "NewBaseBdev", 00:21:57.240 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:57.240 "is_configured": true, 00:21:57.240 "data_offset": 2048, 00:21:57.240 "data_size": 63488 00:21:57.240 }, 00:21:57.240 { 00:21:57.240 "name": "BaseBdev2", 00:21:57.240 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:57.240 "is_configured": true, 00:21:57.240 "data_offset": 2048, 00:21:57.240 "data_size": 63488 00:21:57.240 }, 00:21:57.240 { 00:21:57.240 "name": "BaseBdev3", 00:21:57.240 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:57.240 "is_configured": true, 00:21:57.240 "data_offset": 2048, 00:21:57.240 "data_size": 63488 00:21:57.240 }, 00:21:57.240 { 00:21:57.240 "name": "BaseBdev4", 00:21:57.240 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:57.240 "is_configured": true, 00:21:57.240 "data_offset": 2048, 00:21:57.240 "data_size": 63488 00:21:57.240 } 00:21:57.240 ] 00:21:57.240 }' 00:21:57.240 12:03:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:57.240 12:03:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:57.806 12:03:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:57.806 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:58.065 [2024-07-15 12:03:11.592394] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:58.065 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:58.065 "name": "Existed_Raid", 00:21:58.065 "aliases": [ 00:21:58.065 "107736d8-4f96-4ba8-926c-87369855819c" 00:21:58.065 ], 00:21:58.065 "product_name": "Raid Volume", 00:21:58.065 "block_size": 512, 00:21:58.065 "num_blocks": 253952, 00:21:58.065 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:58.065 "assigned_rate_limits": { 00:21:58.065 "rw_ios_per_sec": 0, 00:21:58.065 "rw_mbytes_per_sec": 0, 00:21:58.065 "r_mbytes_per_sec": 0, 00:21:58.065 "w_mbytes_per_sec": 0 00:21:58.065 }, 00:21:58.065 "claimed": false, 00:21:58.065 "zoned": false, 00:21:58.065 "supported_io_types": { 00:21:58.065 "read": true, 00:21:58.065 "write": true, 00:21:58.065 "unmap": true, 00:21:58.065 "flush": true, 00:21:58.065 "reset": true, 00:21:58.065 "nvme_admin": false, 00:21:58.065 "nvme_io": false, 00:21:58.065 "nvme_io_md": false, 00:21:58.065 "write_zeroes": true, 00:21:58.065 "zcopy": false, 00:21:58.065 
"get_zone_info": false, 00:21:58.065 "zone_management": false, 00:21:58.065 "zone_append": false, 00:21:58.065 "compare": false, 00:21:58.065 "compare_and_write": false, 00:21:58.065 "abort": false, 00:21:58.065 "seek_hole": false, 00:21:58.065 "seek_data": false, 00:21:58.065 "copy": false, 00:21:58.065 "nvme_iov_md": false 00:21:58.065 }, 00:21:58.065 "memory_domains": [ 00:21:58.065 { 00:21:58.065 "dma_device_id": "system", 00:21:58.065 "dma_device_type": 1 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.065 "dma_device_type": 2 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "system", 00:21:58.065 "dma_device_type": 1 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.065 "dma_device_type": 2 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "system", 00:21:58.065 "dma_device_type": 1 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.065 "dma_device_type": 2 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "system", 00:21:58.065 "dma_device_type": 1 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.065 "dma_device_type": 2 00:21:58.065 } 00:21:58.065 ], 00:21:58.065 "driver_specific": { 00:21:58.065 "raid": { 00:21:58.065 "uuid": "107736d8-4f96-4ba8-926c-87369855819c", 00:21:58.065 "strip_size_kb": 64, 00:21:58.065 "state": "online", 00:21:58.065 "raid_level": "concat", 00:21:58.065 "superblock": true, 00:21:58.065 "num_base_bdevs": 4, 00:21:58.065 "num_base_bdevs_discovered": 4, 00:21:58.065 "num_base_bdevs_operational": 4, 00:21:58.065 "base_bdevs_list": [ 00:21:58.065 { 00:21:58.065 "name": "NewBaseBdev", 00:21:58.065 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:58.065 "is_configured": true, 00:21:58.065 "data_offset": 2048, 00:21:58.065 "data_size": 63488 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "name": "BaseBdev2", 00:21:58.065 
"uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:58.065 "is_configured": true, 00:21:58.065 "data_offset": 2048, 00:21:58.065 "data_size": 63488 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "name": "BaseBdev3", 00:21:58.065 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:58.065 "is_configured": true, 00:21:58.065 "data_offset": 2048, 00:21:58.065 "data_size": 63488 00:21:58.065 }, 00:21:58.065 { 00:21:58.065 "name": "BaseBdev4", 00:21:58.065 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:21:58.065 "is_configured": true, 00:21:58.065 "data_offset": 2048, 00:21:58.065 "data_size": 63488 00:21:58.065 } 00:21:58.065 ] 00:21:58.065 } 00:21:58.065 } 00:21:58.065 }' 00:21:58.065 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:58.324 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:58.324 BaseBdev2 00:21:58.324 BaseBdev3 00:21:58.324 BaseBdev4' 00:21:58.324 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:58.324 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:58.324 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:58.324 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:58.324 "name": "NewBaseBdev", 00:21:58.324 "aliases": [ 00:21:58.324 "aeccc50a-fd74-474c-85de-84168baf681f" 00:21:58.324 ], 00:21:58.324 "product_name": "Malloc disk", 00:21:58.324 "block_size": 512, 00:21:58.324 "num_blocks": 65536, 00:21:58.324 "uuid": "aeccc50a-fd74-474c-85de-84168baf681f", 00:21:58.324 "assigned_rate_limits": { 00:21:58.324 "rw_ios_per_sec": 0, 00:21:58.324 "rw_mbytes_per_sec": 0, 
00:21:58.324 "r_mbytes_per_sec": 0, 00:21:58.324 "w_mbytes_per_sec": 0 00:21:58.324 }, 00:21:58.324 "claimed": true, 00:21:58.324 "claim_type": "exclusive_write", 00:21:58.324 "zoned": false, 00:21:58.324 "supported_io_types": { 00:21:58.324 "read": true, 00:21:58.324 "write": true, 00:21:58.325 "unmap": true, 00:21:58.325 "flush": true, 00:21:58.325 "reset": true, 00:21:58.325 "nvme_admin": false, 00:21:58.325 "nvme_io": false, 00:21:58.325 "nvme_io_md": false, 00:21:58.325 "write_zeroes": true, 00:21:58.325 "zcopy": true, 00:21:58.325 "get_zone_info": false, 00:21:58.325 "zone_management": false, 00:21:58.325 "zone_append": false, 00:21:58.325 "compare": false, 00:21:58.325 "compare_and_write": false, 00:21:58.325 "abort": true, 00:21:58.325 "seek_hole": false, 00:21:58.325 "seek_data": false, 00:21:58.325 "copy": true, 00:21:58.325 "nvme_iov_md": false 00:21:58.325 }, 00:21:58.325 "memory_domains": [ 00:21:58.325 { 00:21:58.325 "dma_device_id": "system", 00:21:58.325 "dma_device_type": 1 00:21:58.325 }, 00:21:58.325 { 00:21:58.325 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:58.325 "dma_device_type": 2 00:21:58.325 } 00:21:58.325 ], 00:21:58.325 "driver_specific": {} 00:21:58.325 }' 00:21:58.325 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.584 12:03:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:58.584 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:58.584 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.584 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:58.584 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:58.584 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.584 12:03:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:58.843 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:59.102 "name": "BaseBdev2", 00:21:59.102 "aliases": [ 00:21:59.102 "d1ee9f74-3118-493b-bde1-89aaea92ec57" 00:21:59.102 ], 00:21:59.102 "product_name": "Malloc disk", 00:21:59.102 "block_size": 512, 00:21:59.102 "num_blocks": 65536, 00:21:59.102 "uuid": "d1ee9f74-3118-493b-bde1-89aaea92ec57", 00:21:59.102 "assigned_rate_limits": { 00:21:59.102 "rw_ios_per_sec": 0, 00:21:59.102 "rw_mbytes_per_sec": 0, 00:21:59.102 "r_mbytes_per_sec": 0, 00:21:59.102 "w_mbytes_per_sec": 0 00:21:59.102 }, 00:21:59.102 "claimed": true, 00:21:59.102 "claim_type": "exclusive_write", 00:21:59.102 "zoned": false, 00:21:59.102 "supported_io_types": { 00:21:59.102 "read": true, 00:21:59.102 "write": true, 00:21:59.102 "unmap": true, 00:21:59.102 "flush": true, 00:21:59.102 "reset": true, 00:21:59.102 "nvme_admin": false, 00:21:59.102 "nvme_io": false, 00:21:59.102 "nvme_io_md": false, 00:21:59.102 "write_zeroes": true, 00:21:59.102 "zcopy": true, 00:21:59.102 
"get_zone_info": false, 00:21:59.102 "zone_management": false, 00:21:59.102 "zone_append": false, 00:21:59.102 "compare": false, 00:21:59.102 "compare_and_write": false, 00:21:59.102 "abort": true, 00:21:59.102 "seek_hole": false, 00:21:59.102 "seek_data": false, 00:21:59.102 "copy": true, 00:21:59.102 "nvme_iov_md": false 00:21:59.102 }, 00:21:59.102 "memory_domains": [ 00:21:59.102 { 00:21:59.102 "dma_device_id": "system", 00:21:59.102 "dma_device_type": 1 00:21:59.102 }, 00:21:59.102 { 00:21:59.102 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.102 "dma_device_type": 2 00:21:59.102 } 00:21:59.102 ], 00:21:59.102 "driver_specific": {} 00:21:59.102 }' 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.102 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:59.360 12:03:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:59.619 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:59.619 "name": "BaseBdev3", 00:21:59.619 "aliases": [ 00:21:59.619 "9f02a8cb-8d3a-479e-b419-ad1ede368c30" 00:21:59.619 ], 00:21:59.619 "product_name": "Malloc disk", 00:21:59.619 "block_size": 512, 00:21:59.619 "num_blocks": 65536, 00:21:59.619 "uuid": "9f02a8cb-8d3a-479e-b419-ad1ede368c30", 00:21:59.619 "assigned_rate_limits": { 00:21:59.619 "rw_ios_per_sec": 0, 00:21:59.619 "rw_mbytes_per_sec": 0, 00:21:59.619 "r_mbytes_per_sec": 0, 00:21:59.619 "w_mbytes_per_sec": 0 00:21:59.619 }, 00:21:59.619 "claimed": true, 00:21:59.619 "claim_type": "exclusive_write", 00:21:59.619 "zoned": false, 00:21:59.619 "supported_io_types": { 00:21:59.619 "read": true, 00:21:59.619 "write": true, 00:21:59.619 "unmap": true, 00:21:59.619 "flush": true, 00:21:59.619 "reset": true, 00:21:59.619 "nvme_admin": false, 00:21:59.619 "nvme_io": false, 00:21:59.619 "nvme_io_md": false, 00:21:59.619 "write_zeroes": true, 00:21:59.619 "zcopy": true, 00:21:59.619 "get_zone_info": false, 00:21:59.619 "zone_management": false, 00:21:59.619 "zone_append": false, 00:21:59.619 "compare": false, 00:21:59.619 "compare_and_write": false, 00:21:59.619 "abort": true, 00:21:59.619 "seek_hole": false, 00:21:59.619 "seek_data": false, 00:21:59.619 "copy": true, 00:21:59.619 "nvme_iov_md": false 00:21:59.619 }, 00:21:59.619 "memory_domains": [ 00:21:59.619 { 00:21:59.619 "dma_device_id": "system", 00:21:59.619 "dma_device_type": 1 00:21:59.619 }, 00:21:59.619 { 00:21:59.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:59.619 
"dma_device_type": 2 00:21:59.619 } 00:21:59.619 ], 00:21:59.619 "driver_specific": {} 00:21:59.619 }' 00:21:59.619 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.619 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:59.619 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:59.619 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:59.878 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:00.136 "name": "BaseBdev4", 00:22:00.136 "aliases": [ 00:22:00.136 
"0f46050e-009a-4478-885d-006de1b24c09" 00:22:00.136 ], 00:22:00.136 "product_name": "Malloc disk", 00:22:00.136 "block_size": 512, 00:22:00.136 "num_blocks": 65536, 00:22:00.136 "uuid": "0f46050e-009a-4478-885d-006de1b24c09", 00:22:00.136 "assigned_rate_limits": { 00:22:00.136 "rw_ios_per_sec": 0, 00:22:00.136 "rw_mbytes_per_sec": 0, 00:22:00.136 "r_mbytes_per_sec": 0, 00:22:00.136 "w_mbytes_per_sec": 0 00:22:00.136 }, 00:22:00.136 "claimed": true, 00:22:00.136 "claim_type": "exclusive_write", 00:22:00.136 "zoned": false, 00:22:00.136 "supported_io_types": { 00:22:00.136 "read": true, 00:22:00.136 "write": true, 00:22:00.136 "unmap": true, 00:22:00.136 "flush": true, 00:22:00.136 "reset": true, 00:22:00.136 "nvme_admin": false, 00:22:00.136 "nvme_io": false, 00:22:00.136 "nvme_io_md": false, 00:22:00.136 "write_zeroes": true, 00:22:00.136 "zcopy": true, 00:22:00.136 "get_zone_info": false, 00:22:00.136 "zone_management": false, 00:22:00.136 "zone_append": false, 00:22:00.136 "compare": false, 00:22:00.136 "compare_and_write": false, 00:22:00.136 "abort": true, 00:22:00.136 "seek_hole": false, 00:22:00.136 "seek_data": false, 00:22:00.136 "copy": true, 00:22:00.136 "nvme_iov_md": false 00:22:00.136 }, 00:22:00.136 "memory_domains": [ 00:22:00.136 { 00:22:00.136 "dma_device_id": "system", 00:22:00.136 "dma_device_type": 1 00:22:00.136 }, 00:22:00.136 { 00:22:00.136 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:00.136 "dma_device_type": 2 00:22:00.136 } 00:22:00.136 ], 00:22:00.136 "driver_specific": {} 00:22:00.136 }' 00:22:00.136 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.394 12:03:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:00.394 12:03:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.653 12:03:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:00.653 12:03:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:00.653 12:03:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:00.911 [2024-07-15 12:03:14.275206] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:00.911 [2024-07-15 12:03:14.275238] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:00.911 [2024-07-15 12:03:14.275296] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:00.911 [2024-07-15 12:03:14.275361] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:00.911 [2024-07-15 12:03:14.275374] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x248b120 name Existed_Raid, state offline 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1536978 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1536978 ']' 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1536978 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1536978 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1536978' 00:22:00.911 killing process with pid 1536978 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1536978 00:22:00.911 [2024-07-15 12:03:14.339986] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:00.911 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1536978 00:22:00.911 [2024-07-15 12:03:14.382498] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:01.168 12:03:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:22:01.168 00:22:01.168 real 0m35.300s 00:22:01.168 user 1m4.975s 00:22:01.168 sys 0m6.168s 00:22:01.168 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:01.168 12:03:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:01.168 ************************************ 00:22:01.168 END TEST raid_state_function_test_sb 00:22:01.168 ************************************ 00:22:01.168 12:03:14 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:01.168 12:03:14 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test concat 4 00:22:01.168 12:03:14 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:22:01.168 12:03:14 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:01.168 12:03:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:01.168 ************************************ 00:22:01.168 START TEST raid_superblock_test 00:22:01.169 ************************************ 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test concat 4 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1542717 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1542717 /var/tmp/spdk-raid.sock 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1542717 ']' 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:01.169 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:01.169 12:03:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.169 [2024-07-15 12:03:14.752239] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:22:01.169 [2024-07-15 12:03:14.752304] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1542717 ] 00:22:01.426 [2024-07-15 12:03:14.871855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:01.426 [2024-07-15 12:03:14.974693] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:01.684 [2024-07-15 12:03:15.041323] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.685 [2024-07-15 12:03:15.041361] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:02.250 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:22:02.508 malloc1 00:22:02.508 12:03:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:22:02.766 [2024-07-15 12:03:16.146024] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:02.766 [2024-07-15 12:03:16.146072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.766 [2024-07-15 12:03:16.146095] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10fb560 00:22:02.766 [2024-07-15 12:03:16.146108] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.766 [2024-07-15 12:03:16.147748] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.766 [2024-07-15 12:03:16.147775] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:02.766 pt1 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:02.766 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:02.766 12:03:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:22:03.024 malloc2 00:22:03.024 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:03.283 [2024-07-15 12:03:16.641240] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:03.283 [2024-07-15 12:03:16.641286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.283 [2024-07-15 12:03:16.641302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11995b0 00:22:03.283 [2024-07-15 12:03:16.641320] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.283 [2024-07-15 12:03:16.642889] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.283 [2024-07-15 12:03:16.642917] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:03.283 pt2 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:03.283 12:03:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:03.283 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:22:03.542 malloc3 00:22:03.542 12:03:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:03.542 [2024-07-15 12:03:17.135111] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:03.542 [2024-07-15 12:03:17.135158] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.542 [2024-07-15 12:03:17.135176] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1199be0 00:22:03.542 [2024-07-15 12:03:17.135188] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.542 [2024-07-15 12:03:17.136713] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.542 [2024-07-15 12:03:17.136741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:03.801 pt3 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:22:03.801 
12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:22:03.801 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:22:03.801 malloc4 00:22:04.060 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:04.060 [2024-07-15 12:03:17.557294] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:04.060 [2024-07-15 12:03:17.557335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.060 [2024-07-15 12:03:17.557352] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x119cf00 00:22:04.060 [2024-07-15 12:03:17.557364] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.060 [2024-07-15 12:03:17.558755] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.060 [2024-07-15 12:03:17.558782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:04.060 pt4 00:22:04.060 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:22:04.060 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:22:04.060 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:22:04.320 [2024-07-15 12:03:17.809983] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:22:04.320 [2024-07-15 12:03:17.811319] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:04.320 [2024-07-15 12:03:17.811375] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:04.320 [2024-07-15 12:03:17.811418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:04.320 [2024-07-15 12:03:17.811584] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x119c880 00:22:04.320 [2024-07-15 12:03:17.811595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:04.320 [2024-07-15 12:03:17.811808] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fdf60 00:22:04.320 [2024-07-15 12:03:17.811957] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x119c880 00:22:04.320 [2024-07-15 12:03:17.811967] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x119c880 00:22:04.320 [2024-07-15 12:03:17.812063] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:04.320 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.321 12:03:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.580 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.580 "name": "raid_bdev1", 00:22:04.580 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:04.580 "strip_size_kb": 64, 00:22:04.580 "state": "online", 00:22:04.580 "raid_level": "concat", 00:22:04.580 "superblock": true, 00:22:04.580 "num_base_bdevs": 4, 00:22:04.580 "num_base_bdevs_discovered": 4, 00:22:04.580 "num_base_bdevs_operational": 4, 00:22:04.580 "base_bdevs_list": [ 00:22:04.580 { 00:22:04.580 "name": "pt1", 00:22:04.580 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:04.580 "is_configured": true, 00:22:04.580 "data_offset": 2048, 00:22:04.580 "data_size": 63488 00:22:04.580 }, 00:22:04.580 { 00:22:04.580 "name": "pt2", 00:22:04.580 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:04.580 "is_configured": true, 00:22:04.580 "data_offset": 2048, 00:22:04.580 "data_size": 63488 00:22:04.580 }, 00:22:04.580 { 00:22:04.580 "name": "pt3", 00:22:04.580 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:04.580 "is_configured": true, 00:22:04.580 "data_offset": 2048, 00:22:04.580 "data_size": 63488 00:22:04.580 }, 00:22:04.580 { 00:22:04.580 "name": "pt4", 00:22:04.580 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:04.580 "is_configured": true, 00:22:04.580 "data_offset": 2048, 00:22:04.580 "data_size": 63488 00:22:04.580 } 00:22:04.580 ] 00:22:04.580 }' 00:22:04.580 12:03:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.580 12:03:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:05.148 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:05.149 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:05.149 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:05.407 [2024-07-15 12:03:18.889123] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:05.407 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:05.407 "name": "raid_bdev1", 00:22:05.407 "aliases": [ 00:22:05.407 "c60557bb-d775-4062-a8da-73a3340a241e" 00:22:05.407 ], 00:22:05.407 "product_name": "Raid Volume", 00:22:05.407 "block_size": 512, 00:22:05.407 "num_blocks": 253952, 00:22:05.407 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:05.407 "assigned_rate_limits": { 00:22:05.407 "rw_ios_per_sec": 0, 00:22:05.407 "rw_mbytes_per_sec": 0, 00:22:05.407 "r_mbytes_per_sec": 0, 00:22:05.407 "w_mbytes_per_sec": 0 00:22:05.407 }, 00:22:05.407 "claimed": false, 00:22:05.407 "zoned": false, 00:22:05.407 "supported_io_types": { 00:22:05.407 "read": true, 00:22:05.407 "write": true, 00:22:05.407 
"unmap": true, 00:22:05.407 "flush": true, 00:22:05.407 "reset": true, 00:22:05.407 "nvme_admin": false, 00:22:05.407 "nvme_io": false, 00:22:05.407 "nvme_io_md": false, 00:22:05.407 "write_zeroes": true, 00:22:05.407 "zcopy": false, 00:22:05.407 "get_zone_info": false, 00:22:05.407 "zone_management": false, 00:22:05.407 "zone_append": false, 00:22:05.407 "compare": false, 00:22:05.407 "compare_and_write": false, 00:22:05.407 "abort": false, 00:22:05.407 "seek_hole": false, 00:22:05.407 "seek_data": false, 00:22:05.407 "copy": false, 00:22:05.407 "nvme_iov_md": false 00:22:05.407 }, 00:22:05.407 "memory_domains": [ 00:22:05.407 { 00:22:05.407 "dma_device_id": "system", 00:22:05.407 "dma_device_type": 1 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.407 "dma_device_type": 2 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "system", 00:22:05.407 "dma_device_type": 1 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.407 "dma_device_type": 2 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "system", 00:22:05.407 "dma_device_type": 1 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.407 "dma_device_type": 2 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "system", 00:22:05.407 "dma_device_type": 1 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.407 "dma_device_type": 2 00:22:05.407 } 00:22:05.407 ], 00:22:05.407 "driver_specific": { 00:22:05.407 "raid": { 00:22:05.407 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:05.407 "strip_size_kb": 64, 00:22:05.407 "state": "online", 00:22:05.407 "raid_level": "concat", 00:22:05.407 "superblock": true, 00:22:05.407 "num_base_bdevs": 4, 00:22:05.407 "num_base_bdevs_discovered": 4, 00:22:05.407 "num_base_bdevs_operational": 4, 00:22:05.407 "base_bdevs_list": [ 00:22:05.407 { 00:22:05.407 "name": "pt1", 
00:22:05.407 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:05.407 "is_configured": true, 00:22:05.407 "data_offset": 2048, 00:22:05.407 "data_size": 63488 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "name": "pt2", 00:22:05.407 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:05.407 "is_configured": true, 00:22:05.407 "data_offset": 2048, 00:22:05.407 "data_size": 63488 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "name": "pt3", 00:22:05.407 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:05.407 "is_configured": true, 00:22:05.407 "data_offset": 2048, 00:22:05.407 "data_size": 63488 00:22:05.407 }, 00:22:05.407 { 00:22:05.407 "name": "pt4", 00:22:05.407 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:05.407 "is_configured": true, 00:22:05.407 "data_offset": 2048, 00:22:05.407 "data_size": 63488 00:22:05.407 } 00:22:05.407 ] 00:22:05.408 } 00:22:05.408 } 00:22:05.408 }' 00:22:05.408 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:05.408 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:05.408 pt2 00:22:05.408 pt3 00:22:05.408 pt4' 00:22:05.408 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:05.408 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:05.408 12:03:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:05.667 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:05.667 "name": "pt1", 00:22:05.667 "aliases": [ 00:22:05.667 "00000000-0000-0000-0000-000000000001" 00:22:05.667 ], 00:22:05.667 "product_name": "passthru", 00:22:05.667 "block_size": 512, 00:22:05.667 "num_blocks": 65536, 00:22:05.667 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:22:05.667 "assigned_rate_limits": { 00:22:05.667 "rw_ios_per_sec": 0, 00:22:05.667 "rw_mbytes_per_sec": 0, 00:22:05.667 "r_mbytes_per_sec": 0, 00:22:05.667 "w_mbytes_per_sec": 0 00:22:05.667 }, 00:22:05.667 "claimed": true, 00:22:05.667 "claim_type": "exclusive_write", 00:22:05.667 "zoned": false, 00:22:05.667 "supported_io_types": { 00:22:05.667 "read": true, 00:22:05.667 "write": true, 00:22:05.667 "unmap": true, 00:22:05.667 "flush": true, 00:22:05.667 "reset": true, 00:22:05.667 "nvme_admin": false, 00:22:05.667 "nvme_io": false, 00:22:05.667 "nvme_io_md": false, 00:22:05.667 "write_zeroes": true, 00:22:05.667 "zcopy": true, 00:22:05.667 "get_zone_info": false, 00:22:05.667 "zone_management": false, 00:22:05.667 "zone_append": false, 00:22:05.667 "compare": false, 00:22:05.667 "compare_and_write": false, 00:22:05.667 "abort": true, 00:22:05.667 "seek_hole": false, 00:22:05.667 "seek_data": false, 00:22:05.667 "copy": true, 00:22:05.667 "nvme_iov_md": false 00:22:05.667 }, 00:22:05.667 "memory_domains": [ 00:22:05.667 { 00:22:05.667 "dma_device_id": "system", 00:22:05.667 "dma_device_type": 1 00:22:05.667 }, 00:22:05.667 { 00:22:05.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:05.667 "dma_device_type": 2 00:22:05.667 } 00:22:05.667 ], 00:22:05.667 "driver_specific": { 00:22:05.667 "passthru": { 00:22:05.667 "name": "pt1", 00:22:05.667 "base_bdev_name": "malloc1" 00:22:05.667 } 00:22:05.667 } 00:22:05.667 }' 00:22:05.667 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.667 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:05.926 12:03:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:05.926 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.185 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.185 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:06.185 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:06.185 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:06.185 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.445 "name": "pt2", 00:22:06.445 "aliases": [ 00:22:06.445 "00000000-0000-0000-0000-000000000002" 00:22:06.445 ], 00:22:06.445 "product_name": "passthru", 00:22:06.445 "block_size": 512, 00:22:06.445 "num_blocks": 65536, 00:22:06.445 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:06.445 "assigned_rate_limits": { 00:22:06.445 "rw_ios_per_sec": 0, 00:22:06.445 "rw_mbytes_per_sec": 0, 00:22:06.445 "r_mbytes_per_sec": 0, 00:22:06.445 "w_mbytes_per_sec": 0 00:22:06.445 }, 00:22:06.445 "claimed": true, 00:22:06.445 "claim_type": "exclusive_write", 00:22:06.445 "zoned": false, 00:22:06.445 "supported_io_types": { 00:22:06.445 "read": true, 00:22:06.445 "write": true, 00:22:06.445 "unmap": true, 00:22:06.445 "flush": true, 00:22:06.445 "reset": true, 00:22:06.445 "nvme_admin": false, 00:22:06.445 
"nvme_io": false, 00:22:06.445 "nvme_io_md": false, 00:22:06.445 "write_zeroes": true, 00:22:06.445 "zcopy": true, 00:22:06.445 "get_zone_info": false, 00:22:06.445 "zone_management": false, 00:22:06.445 "zone_append": false, 00:22:06.445 "compare": false, 00:22:06.445 "compare_and_write": false, 00:22:06.445 "abort": true, 00:22:06.445 "seek_hole": false, 00:22:06.445 "seek_data": false, 00:22:06.445 "copy": true, 00:22:06.445 "nvme_iov_md": false 00:22:06.445 }, 00:22:06.445 "memory_domains": [ 00:22:06.445 { 00:22:06.445 "dma_device_id": "system", 00:22:06.445 "dma_device_type": 1 00:22:06.445 }, 00:22:06.445 { 00:22:06.445 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.445 "dma_device_type": 2 00:22:06.445 } 00:22:06.445 ], 00:22:06.445 "driver_specific": { 00:22:06.445 "passthru": { 00:22:06.445 "name": "pt2", 00:22:06.445 "base_bdev_name": "malloc2" 00:22:06.445 } 00:22:06.445 } 00:22:06.445 }' 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:06.445 12:03:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:06.705 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:06.963 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:06.964 "name": "pt3", 00:22:06.964 "aliases": [ 00:22:06.964 "00000000-0000-0000-0000-000000000003" 00:22:06.964 ], 00:22:06.964 "product_name": "passthru", 00:22:06.964 "block_size": 512, 00:22:06.964 "num_blocks": 65536, 00:22:06.964 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:06.964 "assigned_rate_limits": { 00:22:06.964 "rw_ios_per_sec": 0, 00:22:06.964 "rw_mbytes_per_sec": 0, 00:22:06.964 "r_mbytes_per_sec": 0, 00:22:06.964 "w_mbytes_per_sec": 0 00:22:06.964 }, 00:22:06.964 "claimed": true, 00:22:06.964 "claim_type": "exclusive_write", 00:22:06.964 "zoned": false, 00:22:06.964 "supported_io_types": { 00:22:06.964 "read": true, 00:22:06.964 "write": true, 00:22:06.964 "unmap": true, 00:22:06.964 "flush": true, 00:22:06.964 "reset": true, 00:22:06.964 "nvme_admin": false, 00:22:06.964 "nvme_io": false, 00:22:06.964 "nvme_io_md": false, 00:22:06.964 "write_zeroes": true, 00:22:06.964 "zcopy": true, 00:22:06.964 "get_zone_info": false, 00:22:06.964 "zone_management": false, 00:22:06.964 "zone_append": false, 00:22:06.964 "compare": false, 00:22:06.964 "compare_and_write": false, 00:22:06.964 "abort": true, 00:22:06.964 "seek_hole": false, 00:22:06.964 "seek_data": false, 00:22:06.964 "copy": true, 00:22:06.964 "nvme_iov_md": false 00:22:06.964 }, 00:22:06.964 "memory_domains": [ 00:22:06.964 { 00:22:06.964 "dma_device_id": "system", 00:22:06.964 
"dma_device_type": 1 00:22:06.964 }, 00:22:06.964 { 00:22:06.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:06.964 "dma_device_type": 2 00:22:06.964 } 00:22:06.964 ], 00:22:06.964 "driver_specific": { 00:22:06.964 "passthru": { 00:22:06.964 "name": "pt3", 00:22:06.964 "base_bdev_name": "malloc3" 00:22:06.964 } 00:22:06.964 } 00:22:06.964 }' 00:22:06.964 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.964 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:06.964 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:06.964 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:06.964 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:07.223 12:03:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:07.487 12:03:20 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:07.487 "name": "pt4", 00:22:07.487 "aliases": [ 00:22:07.487 "00000000-0000-0000-0000-000000000004" 00:22:07.487 ], 00:22:07.487 "product_name": "passthru", 00:22:07.487 "block_size": 512, 00:22:07.487 "num_blocks": 65536, 00:22:07.487 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:07.487 "assigned_rate_limits": { 00:22:07.487 "rw_ios_per_sec": 0, 00:22:07.487 "rw_mbytes_per_sec": 0, 00:22:07.487 "r_mbytes_per_sec": 0, 00:22:07.487 "w_mbytes_per_sec": 0 00:22:07.487 }, 00:22:07.487 "claimed": true, 00:22:07.487 "claim_type": "exclusive_write", 00:22:07.487 "zoned": false, 00:22:07.487 "supported_io_types": { 00:22:07.487 "read": true, 00:22:07.487 "write": true, 00:22:07.487 "unmap": true, 00:22:07.487 "flush": true, 00:22:07.487 "reset": true, 00:22:07.487 "nvme_admin": false, 00:22:07.487 "nvme_io": false, 00:22:07.487 "nvme_io_md": false, 00:22:07.487 "write_zeroes": true, 00:22:07.487 "zcopy": true, 00:22:07.487 "get_zone_info": false, 00:22:07.487 "zone_management": false, 00:22:07.487 "zone_append": false, 00:22:07.487 "compare": false, 00:22:07.487 "compare_and_write": false, 00:22:07.487 "abort": true, 00:22:07.487 "seek_hole": false, 00:22:07.487 "seek_data": false, 00:22:07.487 "copy": true, 00:22:07.487 "nvme_iov_md": false 00:22:07.487 }, 00:22:07.487 "memory_domains": [ 00:22:07.487 { 00:22:07.487 "dma_device_id": "system", 00:22:07.487 "dma_device_type": 1 00:22:07.487 }, 00:22:07.487 { 00:22:07.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:07.487 "dma_device_type": 2 00:22:07.487 } 00:22:07.487 ], 00:22:07.487 "driver_specific": { 00:22:07.487 "passthru": { 00:22:07.487 "name": "pt4", 00:22:07.487 "base_bdev_name": "malloc4" 00:22:07.487 } 00:22:07.487 } 00:22:07.487 }' 00:22:07.487 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:07.487 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:07.745 12:03:21 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:07.746 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:08.004 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:08.004 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:22:08.004 [2024-07-15 12:03:21.568205] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:08.004 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c60557bb-d775-4062-a8da-73a3340a241e 00:22:08.004 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z c60557bb-d775-4062-a8da-73a3340a241e ']' 00:22:08.004 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:08.262 [2024-07-15 12:03:21.820569] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:08.262 
[2024-07-15 12:03:21.820591] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:08.262 [2024-07-15 12:03:21.820642] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:08.262 [2024-07-15 12:03:21.820711] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:08.262 [2024-07-15 12:03:21.820723] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x119c880 name raid_bdev1, state offline 00:22:08.262 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.262 12:03:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:22:08.521 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:22:08.521 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:22:08.521 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:08.521 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:22:08.780 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:08.780 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:09.040 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:09.040 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:22:09.299 12:03:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:22:09.300 12:03:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:22:09.559 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:22:09.559 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:09.826 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:22:10.095 [2024-07-15 12:03:23.593172] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:22:10.096 [2024-07-15 12:03:23.594499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:22:10.096 [2024-07-15 12:03:23.594543] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:22:10.096 [2024-07-15 12:03:23.594583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:22:10.096 [2024-07-15 12:03:23.594628] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:22:10.096 [2024-07-15 12:03:23.594665] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:22:10.096 [2024-07-15 12:03:23.594697] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:22:10.096 [2024-07-15 12:03:23.594720] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:22:10.096 
[2024-07-15 12:03:23.594738] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:10.096 [2024-07-15 12:03:23.594748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x119dbf0 name raid_bdev1, state configuring 00:22:10.096 request: 00:22:10.096 { 00:22:10.096 "name": "raid_bdev1", 00:22:10.096 "raid_level": "concat", 00:22:10.096 "base_bdevs": [ 00:22:10.096 "malloc1", 00:22:10.096 "malloc2", 00:22:10.096 "malloc3", 00:22:10.096 "malloc4" 00:22:10.096 ], 00:22:10.096 "strip_size_kb": 64, 00:22:10.096 "superblock": false, 00:22:10.096 "method": "bdev_raid_create", 00:22:10.096 "req_id": 1 00:22:10.096 } 00:22:10.096 Got JSON-RPC error response 00:22:10.096 response: 00:22:10.096 { 00:22:10.096 "code": -17, 00:22:10.096 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:22:10.096 } 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.096 12:03:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:22:10.701 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:22:10.701 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:22:10.701 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:22:10.960 [2024-07-15 12:03:24.355108] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:22:10.960 [2024-07-15 12:03:24.355155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:10.960 [2024-07-15 12:03:24.355173] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x119b1f0 00:22:10.960 [2024-07-15 12:03:24.355185] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:10.960 [2024-07-15 12:03:24.356810] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:10.960 [2024-07-15 12:03:24.356840] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:22:10.960 [2024-07-15 12:03:24.356902] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:22:10.960 [2024-07-15 12:03:24.356928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:22:10.960 pt1 00:22:10.960 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:10.960 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:10.960 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.961 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.220 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.220 "name": "raid_bdev1", 00:22:11.220 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:11.220 "strip_size_kb": 64, 00:22:11.220 "state": "configuring", 00:22:11.220 "raid_level": "concat", 00:22:11.220 "superblock": true, 00:22:11.220 "num_base_bdevs": 4, 00:22:11.220 "num_base_bdevs_discovered": 1, 00:22:11.220 "num_base_bdevs_operational": 4, 00:22:11.220 "base_bdevs_list": [ 00:22:11.220 { 00:22:11.220 "name": "pt1", 00:22:11.220 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:11.220 "is_configured": true, 00:22:11.220 "data_offset": 2048, 00:22:11.220 "data_size": 63488 00:22:11.220 }, 00:22:11.220 { 00:22:11.220 "name": null, 00:22:11.220 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:11.220 "is_configured": false, 00:22:11.220 "data_offset": 2048, 00:22:11.220 "data_size": 63488 00:22:11.220 }, 00:22:11.220 { 00:22:11.220 "name": null, 00:22:11.220 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:11.220 "is_configured": false, 00:22:11.220 "data_offset": 2048, 00:22:11.220 "data_size": 63488 00:22:11.220 }, 00:22:11.220 { 00:22:11.220 "name": null, 00:22:11.220 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:11.220 "is_configured": false, 00:22:11.220 "data_offset": 2048, 00:22:11.220 "data_size": 63488 00:22:11.220 } 00:22:11.220 ] 00:22:11.220 }' 00:22:11.220 12:03:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.220 12:03:24 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:11.788 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:22:11.788 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:12.048 [2024-07-15 12:03:25.433996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:12.048 [2024-07-15 12:03:25.434053] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:12.048 [2024-07-15 12:03:25.434072] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x119e920 00:22:12.048 [2024-07-15 12:03:25.434085] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:12.048 [2024-07-15 12:03:25.434424] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:12.048 [2024-07-15 12:03:25.434442] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:12.048 [2024-07-15 12:03:25.434502] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:12.048 [2024-07-15 12:03:25.434520] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:12.048 pt2 00:22:12.048 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:22:12.307 [2024-07-15 12:03:25.682661] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:12.307 12:03:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:12.307 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:12.566 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:12.566 "name": "raid_bdev1", 00:22:12.566 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:12.566 "strip_size_kb": 64, 00:22:12.566 "state": "configuring", 00:22:12.566 "raid_level": "concat", 00:22:12.566 "superblock": true, 00:22:12.566 "num_base_bdevs": 4, 00:22:12.566 "num_base_bdevs_discovered": 1, 00:22:12.566 "num_base_bdevs_operational": 4, 00:22:12.566 "base_bdevs_list": [ 00:22:12.566 { 00:22:12.566 "name": "pt1", 00:22:12.566 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:12.566 "is_configured": true, 00:22:12.566 "data_offset": 2048, 00:22:12.566 "data_size": 63488 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": null, 00:22:12.566 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:12.566 
"is_configured": false, 00:22:12.566 "data_offset": 2048, 00:22:12.566 "data_size": 63488 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": null, 00:22:12.566 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:12.566 "is_configured": false, 00:22:12.566 "data_offset": 2048, 00:22:12.566 "data_size": 63488 00:22:12.566 }, 00:22:12.566 { 00:22:12.566 "name": null, 00:22:12.566 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:12.566 "is_configured": false, 00:22:12.566 "data_offset": 2048, 00:22:12.566 "data_size": 63488 00:22:12.566 } 00:22:12.566 ] 00:22:12.566 }' 00:22:12.567 12:03:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:12.567 12:03:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:13.135 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:22:13.135 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:13.135 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:22:13.394 [2024-07-15 12:03:26.757500] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:22:13.394 [2024-07-15 12:03:26.757552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.394 [2024-07-15 12:03:26.757571] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x119eb50 00:22:13.394 [2024-07-15 12:03:26.757583] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.394 [2024-07-15 12:03:26.757929] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.394 [2024-07-15 12:03:26.757947] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:22:13.394 [2024-07-15 12:03:26.758005] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:22:13.394 [2024-07-15 12:03:26.758024] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:22:13.394 pt2 00:22:13.394 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:13.394 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:13.394 12:03:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:22:13.653 [2024-07-15 12:03:27.006163] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:22:13.653 [2024-07-15 12:03:27.006207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.653 [2024-07-15 12:03:27.006225] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11a6f60 00:22:13.653 [2024-07-15 12:03:27.006237] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.653 [2024-07-15 12:03:27.006540] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.653 [2024-07-15 12:03:27.006557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:22:13.653 [2024-07-15 12:03:27.006610] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:22:13.653 [2024-07-15 12:03:27.006627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:22:13.653 pt3 00:22:13.653 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:22:13.653 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:13.653 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:22:13.913 [2024-07-15 12:03:27.250814] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:22:13.913 [2024-07-15 12:03:27.250853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:13.913 [2024-07-15 12:03:27.250870] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10fe0b0 00:22:13.913 [2024-07-15 12:03:27.250881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:13.913 [2024-07-15 12:03:27.251183] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:13.913 [2024-07-15 12:03:27.251199] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:22:13.913 [2024-07-15 12:03:27.251253] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:22:13.913 [2024-07-15 12:03:27.251270] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:22:13.913 [2024-07-15 12:03:27.251385] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x10f25c0 00:22:13.913 [2024-07-15 12:03:27.251395] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:13.913 [2024-07-15 12:03:27.251566] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10fdee0 00:22:13.913 [2024-07-15 12:03:27.251704] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10f25c0 00:22:13.913 [2024-07-15 12:03:27.251719] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10f25c0 00:22:13.913 [2024-07-15 12:03:27.251826] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:13.913 pt4 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 
00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.913 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.174 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.174 "name": "raid_bdev1", 00:22:14.174 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:14.174 "strip_size_kb": 64, 00:22:14.174 "state": "online", 00:22:14.174 "raid_level": "concat", 00:22:14.175 "superblock": true, 00:22:14.175 "num_base_bdevs": 4, 00:22:14.175 "num_base_bdevs_discovered": 4, 00:22:14.175 "num_base_bdevs_operational": 4, 
00:22:14.175 "base_bdevs_list": [ 00:22:14.175 { 00:22:14.175 "name": "pt1", 00:22:14.175 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:14.175 "is_configured": true, 00:22:14.175 "data_offset": 2048, 00:22:14.175 "data_size": 63488 00:22:14.175 }, 00:22:14.175 { 00:22:14.175 "name": "pt2", 00:22:14.175 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.175 "is_configured": true, 00:22:14.175 "data_offset": 2048, 00:22:14.175 "data_size": 63488 00:22:14.175 }, 00:22:14.175 { 00:22:14.175 "name": "pt3", 00:22:14.175 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.175 "is_configured": true, 00:22:14.175 "data_offset": 2048, 00:22:14.175 "data_size": 63488 00:22:14.175 }, 00:22:14.175 { 00:22:14.175 "name": "pt4", 00:22:14.175 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:14.175 "is_configured": true, 00:22:14.175 "data_offset": 2048, 00:22:14.175 "data_size": 63488 00:22:14.175 } 00:22:14.175 ] 00:22:14.175 }' 00:22:14.175 12:03:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.175 12:03:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:14.803 [2024-07-15 12:03:28.346048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:14.803 "name": "raid_bdev1", 00:22:14.803 "aliases": [ 00:22:14.803 "c60557bb-d775-4062-a8da-73a3340a241e" 00:22:14.803 ], 00:22:14.803 "product_name": "Raid Volume", 00:22:14.803 "block_size": 512, 00:22:14.803 "num_blocks": 253952, 00:22:14.803 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:14.803 "assigned_rate_limits": { 00:22:14.803 "rw_ios_per_sec": 0, 00:22:14.803 "rw_mbytes_per_sec": 0, 00:22:14.803 "r_mbytes_per_sec": 0, 00:22:14.803 "w_mbytes_per_sec": 0 00:22:14.803 }, 00:22:14.803 "claimed": false, 00:22:14.803 "zoned": false, 00:22:14.803 "supported_io_types": { 00:22:14.803 "read": true, 00:22:14.803 "write": true, 00:22:14.803 "unmap": true, 00:22:14.803 "flush": true, 00:22:14.803 "reset": true, 00:22:14.803 "nvme_admin": false, 00:22:14.803 "nvme_io": false, 00:22:14.803 "nvme_io_md": false, 00:22:14.803 "write_zeroes": true, 00:22:14.803 "zcopy": false, 00:22:14.803 "get_zone_info": false, 00:22:14.803 "zone_management": false, 00:22:14.803 "zone_append": false, 00:22:14.803 "compare": false, 00:22:14.803 "compare_and_write": false, 00:22:14.803 "abort": false, 00:22:14.803 "seek_hole": false, 00:22:14.803 "seek_data": false, 00:22:14.803 "copy": false, 00:22:14.803 "nvme_iov_md": false 00:22:14.803 }, 00:22:14.803 "memory_domains": [ 00:22:14.803 { 00:22:14.803 "dma_device_id": "system", 00:22:14.803 "dma_device_type": 1 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.803 "dma_device_type": 2 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "system", 00:22:14.803 "dma_device_type": 1 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:22:14.803 "dma_device_type": 2 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "system", 00:22:14.803 "dma_device_type": 1 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.803 "dma_device_type": 2 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "system", 00:22:14.803 "dma_device_type": 1 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:14.803 "dma_device_type": 2 00:22:14.803 } 00:22:14.803 ], 00:22:14.803 "driver_specific": { 00:22:14.803 "raid": { 00:22:14.803 "uuid": "c60557bb-d775-4062-a8da-73a3340a241e", 00:22:14.803 "strip_size_kb": 64, 00:22:14.803 "state": "online", 00:22:14.803 "raid_level": "concat", 00:22:14.803 "superblock": true, 00:22:14.803 "num_base_bdevs": 4, 00:22:14.803 "num_base_bdevs_discovered": 4, 00:22:14.803 "num_base_bdevs_operational": 4, 00:22:14.803 "base_bdevs_list": [ 00:22:14.803 { 00:22:14.803 "name": "pt1", 00:22:14.803 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:14.803 "is_configured": true, 00:22:14.803 "data_offset": 2048, 00:22:14.803 "data_size": 63488 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "name": "pt2", 00:22:14.803 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:14.803 "is_configured": true, 00:22:14.803 "data_offset": 2048, 00:22:14.803 "data_size": 63488 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "name": "pt3", 00:22:14.803 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:14.803 "is_configured": true, 00:22:14.803 "data_offset": 2048, 00:22:14.803 "data_size": 63488 00:22:14.803 }, 00:22:14.803 { 00:22:14.803 "name": "pt4", 00:22:14.803 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:14.803 "is_configured": true, 00:22:14.803 "data_offset": 2048, 00:22:14.803 "data_size": 63488 00:22:14.803 } 00:22:14.803 ] 00:22:14.803 } 00:22:14.803 } 00:22:14.803 }' 00:22:14.803 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:22:15.063 pt2 00:22:15.063 pt3 00:22:15.063 pt4' 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.063 "name": "pt1", 00:22:15.063 "aliases": [ 00:22:15.063 "00000000-0000-0000-0000-000000000001" 00:22:15.063 ], 00:22:15.063 "product_name": "passthru", 00:22:15.063 "block_size": 512, 00:22:15.063 "num_blocks": 65536, 00:22:15.063 "uuid": "00000000-0000-0000-0000-000000000001", 00:22:15.063 "assigned_rate_limits": { 00:22:15.063 "rw_ios_per_sec": 0, 00:22:15.063 "rw_mbytes_per_sec": 0, 00:22:15.063 "r_mbytes_per_sec": 0, 00:22:15.063 "w_mbytes_per_sec": 0 00:22:15.063 }, 00:22:15.063 "claimed": true, 00:22:15.063 "claim_type": "exclusive_write", 00:22:15.063 "zoned": false, 00:22:15.063 "supported_io_types": { 00:22:15.063 "read": true, 00:22:15.063 "write": true, 00:22:15.063 "unmap": true, 00:22:15.063 "flush": true, 00:22:15.063 "reset": true, 00:22:15.063 "nvme_admin": false, 00:22:15.063 "nvme_io": false, 00:22:15.063 "nvme_io_md": false, 00:22:15.063 "write_zeroes": true, 00:22:15.063 "zcopy": true, 00:22:15.063 "get_zone_info": false, 00:22:15.063 "zone_management": false, 00:22:15.063 "zone_append": false, 00:22:15.063 "compare": false, 00:22:15.063 "compare_and_write": false, 00:22:15.063 "abort": true, 00:22:15.063 "seek_hole": false, 00:22:15.063 "seek_data": false, 00:22:15.063 "copy": true, 00:22:15.063 "nvme_iov_md": 
false 00:22:15.063 }, 00:22:15.063 "memory_domains": [ 00:22:15.063 { 00:22:15.063 "dma_device_id": "system", 00:22:15.063 "dma_device_type": 1 00:22:15.063 }, 00:22:15.063 { 00:22:15.063 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.063 "dma_device_type": 2 00:22:15.063 } 00:22:15.063 ], 00:22:15.063 "driver_specific": { 00:22:15.063 "passthru": { 00:22:15.063 "name": "pt1", 00:22:15.063 "base_bdev_name": "malloc1" 00:22:15.063 } 00:22:15.063 } 00:22:15.063 }' 00:22:15.063 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:15.322 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.582 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:15.582 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:15.582 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:15.582 12:03:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:22:15.582 12:03:28 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:15.842 "name": "pt2", 00:22:15.842 "aliases": [ 00:22:15.842 "00000000-0000-0000-0000-000000000002" 00:22:15.842 ], 00:22:15.842 "product_name": "passthru", 00:22:15.842 "block_size": 512, 00:22:15.842 "num_blocks": 65536, 00:22:15.842 "uuid": "00000000-0000-0000-0000-000000000002", 00:22:15.842 "assigned_rate_limits": { 00:22:15.842 "rw_ios_per_sec": 0, 00:22:15.842 "rw_mbytes_per_sec": 0, 00:22:15.842 "r_mbytes_per_sec": 0, 00:22:15.842 "w_mbytes_per_sec": 0 00:22:15.842 }, 00:22:15.842 "claimed": true, 00:22:15.842 "claim_type": "exclusive_write", 00:22:15.842 "zoned": false, 00:22:15.842 "supported_io_types": { 00:22:15.842 "read": true, 00:22:15.842 "write": true, 00:22:15.842 "unmap": true, 00:22:15.842 "flush": true, 00:22:15.842 "reset": true, 00:22:15.842 "nvme_admin": false, 00:22:15.842 "nvme_io": false, 00:22:15.842 "nvme_io_md": false, 00:22:15.842 "write_zeroes": true, 00:22:15.842 "zcopy": true, 00:22:15.842 "get_zone_info": false, 00:22:15.842 "zone_management": false, 00:22:15.842 "zone_append": false, 00:22:15.842 "compare": false, 00:22:15.842 "compare_and_write": false, 00:22:15.842 "abort": true, 00:22:15.842 "seek_hole": false, 00:22:15.842 "seek_data": false, 00:22:15.842 "copy": true, 00:22:15.842 "nvme_iov_md": false 00:22:15.842 }, 00:22:15.842 "memory_domains": [ 00:22:15.842 { 00:22:15.842 "dma_device_id": "system", 00:22:15.842 "dma_device_type": 1 00:22:15.842 }, 00:22:15.842 { 00:22:15.842 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:15.842 "dma_device_type": 2 00:22:15.842 } 00:22:15.842 ], 00:22:15.842 "driver_specific": { 00:22:15.842 "passthru": { 00:22:15.842 "name": "pt2", 00:22:15.842 "base_bdev_name": "malloc2" 00:22:15.842 } 00:22:15.842 } 00:22:15.842 }' 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:15.842 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:22:16.101 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:16.360 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:16.360 "name": "pt3", 00:22:16.360 "aliases": [ 00:22:16.360 "00000000-0000-0000-0000-000000000003" 00:22:16.360 ], 00:22:16.360 "product_name": "passthru", 00:22:16.360 "block_size": 512, 00:22:16.360 "num_blocks": 65536, 00:22:16.360 "uuid": "00000000-0000-0000-0000-000000000003", 00:22:16.360 "assigned_rate_limits": { 00:22:16.360 "rw_ios_per_sec": 0, 00:22:16.360 "rw_mbytes_per_sec": 0, 
00:22:16.360 "r_mbytes_per_sec": 0, 00:22:16.360 "w_mbytes_per_sec": 0 00:22:16.360 }, 00:22:16.360 "claimed": true, 00:22:16.360 "claim_type": "exclusive_write", 00:22:16.360 "zoned": false, 00:22:16.360 "supported_io_types": { 00:22:16.360 "read": true, 00:22:16.360 "write": true, 00:22:16.360 "unmap": true, 00:22:16.360 "flush": true, 00:22:16.360 "reset": true, 00:22:16.360 "nvme_admin": false, 00:22:16.360 "nvme_io": false, 00:22:16.360 "nvme_io_md": false, 00:22:16.360 "write_zeroes": true, 00:22:16.361 "zcopy": true, 00:22:16.361 "get_zone_info": false, 00:22:16.361 "zone_management": false, 00:22:16.361 "zone_append": false, 00:22:16.361 "compare": false, 00:22:16.361 "compare_and_write": false, 00:22:16.361 "abort": true, 00:22:16.361 "seek_hole": false, 00:22:16.361 "seek_data": false, 00:22:16.361 "copy": true, 00:22:16.361 "nvme_iov_md": false 00:22:16.361 }, 00:22:16.361 "memory_domains": [ 00:22:16.361 { 00:22:16.361 "dma_device_id": "system", 00:22:16.361 "dma_device_type": 1 00:22:16.361 }, 00:22:16.361 { 00:22:16.361 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:16.361 "dma_device_type": 2 00:22:16.361 } 00:22:16.361 ], 00:22:16.361 "driver_specific": { 00:22:16.361 "passthru": { 00:22:16.361 "name": "pt3", 00:22:16.361 "base_bdev_name": "malloc3" 00:22:16.361 } 00:22:16.361 } 00:22:16.361 }' 00:22:16.361 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.361 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:16.361 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:16.361 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.634 12:03:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.634 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:16.894 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:16.894 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:16.894 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:22:16.894 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:17.153 "name": "pt4", 00:22:17.153 "aliases": [ 00:22:17.153 "00000000-0000-0000-0000-000000000004" 00:22:17.153 ], 00:22:17.153 "product_name": "passthru", 00:22:17.153 "block_size": 512, 00:22:17.153 "num_blocks": 65536, 00:22:17.153 "uuid": "00000000-0000-0000-0000-000000000004", 00:22:17.153 "assigned_rate_limits": { 00:22:17.153 "rw_ios_per_sec": 0, 00:22:17.153 "rw_mbytes_per_sec": 0, 00:22:17.153 "r_mbytes_per_sec": 0, 00:22:17.153 "w_mbytes_per_sec": 0 00:22:17.153 }, 00:22:17.153 "claimed": true, 00:22:17.153 "claim_type": "exclusive_write", 00:22:17.153 "zoned": false, 00:22:17.153 "supported_io_types": { 00:22:17.153 "read": true, 00:22:17.153 "write": true, 00:22:17.153 "unmap": true, 00:22:17.153 "flush": true, 00:22:17.153 "reset": true, 00:22:17.153 "nvme_admin": false, 00:22:17.153 "nvme_io": false, 00:22:17.153 "nvme_io_md": false, 00:22:17.153 "write_zeroes": true, 00:22:17.153 "zcopy": true, 00:22:17.153 "get_zone_info": false, 00:22:17.153 
"zone_management": false, 00:22:17.153 "zone_append": false, 00:22:17.153 "compare": false, 00:22:17.153 "compare_and_write": false, 00:22:17.153 "abort": true, 00:22:17.153 "seek_hole": false, 00:22:17.153 "seek_data": false, 00:22:17.153 "copy": true, 00:22:17.153 "nvme_iov_md": false 00:22:17.153 }, 00:22:17.153 "memory_domains": [ 00:22:17.153 { 00:22:17.153 "dma_device_id": "system", 00:22:17.153 "dma_device_type": 1 00:22:17.153 }, 00:22:17.153 { 00:22:17.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:17.153 "dma_device_type": 2 00:22:17.153 } 00:22:17.153 ], 00:22:17.153 "driver_specific": { 00:22:17.153 "passthru": { 00:22:17.153 "name": "pt4", 00:22:17.153 "base_bdev_name": "malloc4" 00:22:17.153 } 00:22:17.153 } 00:22:17.153 }' 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:17.153 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:17.412 12:03:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:22:17.672 [2024-07-15 12:03:31.121407] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' c60557bb-d775-4062-a8da-73a3340a241e '!=' c60557bb-d775-4062-a8da-73a3340a241e ']' 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1542717 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1542717 ']' 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1542717 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1542717 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1542717' 00:22:17.672 killing process with pid 1542717 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1542717 
00:22:17.672 [2024-07-15 12:03:31.192219] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:17.672 [2024-07-15 12:03:31.192284] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:17.672 [2024-07-15 12:03:31.192355] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:17.672 [2024-07-15 12:03:31.192367] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10f25c0 name raid_bdev1, state offline 00:22:17.672 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1542717 00:22:17.672 [2024-07-15 12:03:31.234756] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:17.932 12:03:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:22:17.932 00:22:17.932 real 0m16.768s 00:22:17.932 user 0m30.338s 00:22:17.932 sys 0m2.988s 00:22:17.932 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:17.932 12:03:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:22:17.932 ************************************ 00:22:17.932 END TEST raid_superblock_test 00:22:17.932 ************************************ 00:22:17.932 12:03:31 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:17.932 12:03:31 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:22:17.932 12:03:31 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:17.932 12:03:31 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:17.932 12:03:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:18.191 ************************************ 00:22:18.191 START TEST raid_read_error_test 00:22:18.191 ************************************ 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 read 00:22:18.191 
12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:18.191 12:03:31 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.OgfYjGzaM1 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1545144 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1545144 /var/tmp/spdk-raid.sock 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1545144 ']' 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and 
listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:18.191 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:18.191 12:03:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:18.191 [2024-07-15 12:03:31.620312] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:22:18.191 [2024-07-15 12:03:31.620379] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1545144 ] 00:22:18.191 [2024-07-15 12:03:31.750503] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:18.450 [2024-07-15 12:03:31.857073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:18.450 [2024-07-15 12:03:31.919164] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:18.450 [2024-07-15 12:03:31.919195] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:19.018 12:03:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:19.018 12:03:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:19.018 12:03:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:19.018 12:03:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:19.277 BaseBdev1_malloc 00:22:19.277 12:03:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 
00:22:19.536 true 00:22:19.536 12:03:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:19.797 [2024-07-15 12:03:33.320178] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:19.797 [2024-07-15 12:03:33.320230] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:19.797 [2024-07-15 12:03:33.320250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f34e0 00:22:19.797 [2024-07-15 12:03:33.320263] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:19.797 [2024-07-15 12:03:33.322072] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:19.797 [2024-07-15 12:03:33.322101] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:19.797 BaseBdev1 00:22:19.797 12:03:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:19.797 12:03:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:20.055 BaseBdev2_malloc 00:22:20.055 12:03:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:20.314 true 00:22:20.314 12:03:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:20.573 [2024-07-15 12:03:34.055175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:20.573 [2024-07-15 12:03:34.055222] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:20.573 [2024-07-15 12:03:34.055241] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9f87b0 00:22:20.573 [2024-07-15 12:03:34.055253] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:20.573 [2024-07-15 12:03:34.056815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:20.573 [2024-07-15 12:03:34.056843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:20.573 BaseBdev2 00:22:20.573 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:20.573 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:20.832 BaseBdev3_malloc 00:22:20.832 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:21.092 true 00:22:21.092 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:21.352 [2024-07-15 12:03:34.790897] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:21.352 [2024-07-15 12:03:34.790947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:21.352 [2024-07-15 12:03:34.790969] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9fa8f0 00:22:21.352 [2024-07-15 12:03:34.790981] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:21.352 [2024-07-15 12:03:34.792601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:22:21.352 [2024-07-15 12:03:34.792630] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:21.352 BaseBdev3 00:22:21.352 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:21.352 12:03:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:21.611 BaseBdev4_malloc 00:22:21.611 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:21.870 true 00:22:21.870 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:22.128 [2024-07-15 12:03:35.526609] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:22.128 [2024-07-15 12:03:35.526653] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:22.128 [2024-07-15 12:03:35.526672] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9fcdc0 00:22:22.128 [2024-07-15 12:03:35.526692] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:22.128 [2024-07-15 12:03:35.528246] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:22.128 [2024-07-15 12:03:35.528273] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:22.128 BaseBdev4 00:22:22.128 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 
BaseBdev4' -n raid_bdev1 -s 00:22:22.387 [2024-07-15 12:03:35.767281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:22.387 [2024-07-15 12:03:35.768609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:22.387 [2024-07-15 12:03:35.768676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:22.387 [2024-07-15 12:03:35.768742] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:22.387 [2024-07-15 12:03:35.768975] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9fb090 00:22:22.387 [2024-07-15 12:03:35.768987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:22.387 [2024-07-15 12:03:35.769187] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9fdbf0 00:22:22.387 [2024-07-15 12:03:35.769337] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9fb090 00:22:22.387 [2024-07-15 12:03:35.769347] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9fb090 00:22:22.387 [2024-07-15 12:03:35.769454] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:22.387 12:03:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.387 12:03:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.646 12:03:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.646 "name": "raid_bdev1", 00:22:22.646 "uuid": "5e262ca8-d751-47f5-ae4a-19a309b7640e", 00:22:22.646 "strip_size_kb": 64, 00:22:22.646 "state": "online", 00:22:22.646 "raid_level": "concat", 00:22:22.646 "superblock": true, 00:22:22.646 "num_base_bdevs": 4, 00:22:22.646 "num_base_bdevs_discovered": 4, 00:22:22.646 "num_base_bdevs_operational": 4, 00:22:22.646 "base_bdevs_list": [ 00:22:22.646 { 00:22:22.646 "name": "BaseBdev1", 00:22:22.646 "uuid": "3a29f2aa-5422-5474-808b-4bd33b099642", 00:22:22.646 "is_configured": true, 00:22:22.646 "data_offset": 2048, 00:22:22.646 "data_size": 63488 00:22:22.646 }, 00:22:22.646 { 00:22:22.646 "name": "BaseBdev2", 00:22:22.646 "uuid": "5725d4ed-b51c-5297-a765-aee8e9495414", 00:22:22.647 "is_configured": true, 00:22:22.647 "data_offset": 2048, 00:22:22.647 "data_size": 63488 00:22:22.647 }, 00:22:22.647 { 00:22:22.647 "name": "BaseBdev3", 00:22:22.647 "uuid": "4089ad3a-f70e-545c-8bd4-3109286e0f49", 00:22:22.647 "is_configured": true, 00:22:22.647 "data_offset": 2048, 00:22:22.647 "data_size": 63488 00:22:22.647 }, 00:22:22.647 { 00:22:22.647 "name": "BaseBdev4", 00:22:22.647 "uuid": 
"1e22e5f3-8e4c-51a5-93ad-e50d3d326392", 00:22:22.647 "is_configured": true, 00:22:22.647 "data_offset": 2048, 00:22:22.647 "data_size": 63488 00:22:22.647 } 00:22:22.647 ] 00:22:22.647 }' 00:22:22.647 12:03:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.647 12:03:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.214 12:03:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:22:23.214 12:03:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:23.214 [2024-07-15 12:03:36.714038] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9fda40 00:22:24.152 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:24.411 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.412 12:03:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:24.671 12:03:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:24.671 "name": "raid_bdev1", 00:22:24.671 "uuid": "5e262ca8-d751-47f5-ae4a-19a309b7640e", 00:22:24.671 "strip_size_kb": 64, 00:22:24.671 "state": "online", 00:22:24.671 "raid_level": "concat", 00:22:24.671 "superblock": true, 00:22:24.671 "num_base_bdevs": 4, 00:22:24.671 "num_base_bdevs_discovered": 4, 00:22:24.671 "num_base_bdevs_operational": 4, 00:22:24.671 "base_bdevs_list": [ 00:22:24.671 { 00:22:24.671 "name": "BaseBdev1", 00:22:24.671 "uuid": "3a29f2aa-5422-5474-808b-4bd33b099642", 00:22:24.671 "is_configured": true, 00:22:24.671 "data_offset": 2048, 00:22:24.671 "data_size": 63488 00:22:24.671 }, 00:22:24.671 { 00:22:24.671 "name": "BaseBdev2", 00:22:24.671 "uuid": "5725d4ed-b51c-5297-a765-aee8e9495414", 00:22:24.671 "is_configured": true, 00:22:24.671 "data_offset": 2048, 00:22:24.671 "data_size": 63488 00:22:24.671 }, 00:22:24.671 { 00:22:24.671 "name": "BaseBdev3", 00:22:24.671 "uuid": "4089ad3a-f70e-545c-8bd4-3109286e0f49", 00:22:24.671 "is_configured": true, 00:22:24.671 "data_offset": 2048, 00:22:24.671 "data_size": 63488 00:22:24.671 }, 00:22:24.671 { 
00:22:24.671 "name": "BaseBdev4", 00:22:24.671 "uuid": "1e22e5f3-8e4c-51a5-93ad-e50d3d326392", 00:22:24.671 "is_configured": true, 00:22:24.671 "data_offset": 2048, 00:22:24.671 "data_size": 63488 00:22:24.671 } 00:22:24.671 ] 00:22:24.671 }' 00:22:24.671 12:03:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:24.671 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:25.239 12:03:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:25.497 [2024-07-15 12:03:38.907458] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:25.497 [2024-07-15 12:03:38.907494] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:25.497 [2024-07-15 12:03:38.910662] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:25.497 [2024-07-15 12:03:38.910710] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:25.497 [2024-07-15 12:03:38.910748] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:25.497 [2024-07-15 12:03:38.910759] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9fb090 name raid_bdev1, state offline 00:22:25.497 0 00:22:25.497 12:03:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1545144 00:22:25.497 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1545144 ']' 00:22:25.497 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1545144 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:25.498 12:03:38 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1545144 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1545144' 00:22:25.498 killing process with pid 1545144 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1545144 00:22:25.498 [2024-07-15 12:03:38.991790] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:25.498 12:03:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1545144 00:22:25.498 [2024-07-15 12:03:39.023985] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.OgfYjGzaM1 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:22:25.757 00:22:25.757 real 0m7.727s 00:22:25.757 user 0m12.309s 00:22:25.757 sys 0m1.392s 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:25.757 12:03:39 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:22:25.757 ************************************ 00:22:25.757 END TEST raid_read_error_test 00:22:25.757 ************************************ 00:22:25.757 12:03:39 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:25.757 12:03:39 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:22:25.757 12:03:39 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:25.757 12:03:39 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:25.757 12:03:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:26.017 ************************************ 00:22:26.017 START TEST raid_write_error_test 00:22:26.017 ************************************ 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test concat 4 write 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.foItAEn1wk 00:22:26.017 12:03:39 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1546295 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1546295 /var/tmp/spdk-raid.sock 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1546295 ']' 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:26.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:26.017 12:03:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:26.017 [2024-07-15 12:03:39.437415] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:22:26.017 [2024-07-15 12:03:39.437486] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1546295 ] 00:22:26.017 [2024-07-15 12:03:39.569099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:26.277 [2024-07-15 12:03:39.670658] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:26.277 [2024-07-15 12:03:39.733594] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:26.277 [2024-07-15 12:03:39.733637] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:26.845 12:03:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:26.845 12:03:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0 00:22:26.845 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:26.845 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:27.104 BaseBdev1_malloc 00:22:27.104 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:27.364 true 00:22:27.364 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:27.364 [2024-07-15 12:03:40.875346] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:27.364 [2024-07-15 12:03:40.875400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:22:27.364 [2024-07-15 12:03:40.875418] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17da4e0 00:22:27.364 [2024-07-15 12:03:40.875431] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.364 [2024-07-15 12:03:40.877069] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.364 [2024-07-15 12:03:40.877097] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:27.364 BaseBdev1 00:22:27.364 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:27.364 12:03:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:27.623 BaseBdev2_malloc 00:22:27.623 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:22:27.882 true 00:22:27.882 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:27.882 [2024-07-15 12:03:41.397400] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:27.882 [2024-07-15 12:03:41.397449] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:27.882 [2024-07-15 12:03:41.397466] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17df7b0 00:22:27.882 [2024-07-15 12:03:41.397478] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:27.882 [2024-07-15 12:03:41.399115] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:27.882 [2024-07-15 12:03:41.399143] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:27.882 BaseBdev2 00:22:27.882 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:27.882 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:28.140 BaseBdev3_malloc 00:22:28.140 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:28.397 true 00:22:28.397 12:03:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:28.654 [2024-07-15 12:03:42.019752] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:22:28.654 [2024-07-15 12:03:42.019804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:28.654 [2024-07-15 12:03:42.019825] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e18f0 00:22:28.654 [2024-07-15 12:03:42.019838] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:28.654 [2024-07-15 12:03:42.021244] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:28.654 [2024-07-15 12:03:42.021270] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:28.654 BaseBdev3 00:22:28.654 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:22:28.654 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:28.654 BaseBdev4_malloc 00:22:28.654 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:28.911 true 00:22:28.911 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:29.169 [2024-07-15 12:03:42.545707] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:29.169 [2024-07-15 12:03:42.545749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:29.169 [2024-07-15 12:03:42.545768] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17e3dc0 00:22:29.170 [2024-07-15 12:03:42.545781] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:29.170 [2024-07-15 12:03:42.547207] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:29.170 [2024-07-15 12:03:42.547235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:29.170 BaseBdev4 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:29.170 [2024-07-15 12:03:42.710176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:29.170 [2024-07-15 12:03:42.711393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:29.170 [2024-07-15 12:03:42.711460] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:29.170 [2024-07-15 12:03:42.711518] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:29.170 [2024-07-15 12:03:42.711752] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17e2090 00:22:29.170 [2024-07-15 12:03:42.711763] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:22:29.170 [2024-07-15 12:03:42.711947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17e4bf0 00:22:29.170 [2024-07-15 12:03:42.712091] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17e2090 00:22:29.170 [2024-07-15 12:03:42.712101] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17e2090 00:22:29.170 [2024-07-15 12:03:42.712198] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:29.170 12:03:42 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.170 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:29.428 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:29.428 "name": "raid_bdev1", 00:22:29.428 "uuid": "6c13dcf1-4b73-4bc6-85b3-c2f9eecc0617", 00:22:29.428 "strip_size_kb": 64, 00:22:29.428 "state": "online", 00:22:29.428 "raid_level": "concat", 00:22:29.428 "superblock": true, 00:22:29.428 "num_base_bdevs": 4, 00:22:29.428 "num_base_bdevs_discovered": 4, 00:22:29.428 "num_base_bdevs_operational": 4, 00:22:29.428 "base_bdevs_list": [ 00:22:29.428 { 00:22:29.428 "name": "BaseBdev1", 00:22:29.428 "uuid": "4464bb4e-8d5f-5361-a5e5-9cd0cdb75100", 00:22:29.428 "is_configured": true, 00:22:29.429 "data_offset": 2048, 00:22:29.429 "data_size": 63488 00:22:29.429 }, 00:22:29.429 { 00:22:29.429 "name": "BaseBdev2", 00:22:29.429 "uuid": "1c33e732-e166-54f7-9bb2-596c9a5a700a", 00:22:29.429 "is_configured": true, 00:22:29.429 "data_offset": 2048, 00:22:29.429 "data_size": 63488 00:22:29.429 }, 00:22:29.429 { 00:22:29.429 "name": "BaseBdev3", 00:22:29.429 "uuid": "562868b2-7221-57e5-89eb-a17bcca7f62f", 00:22:29.429 "is_configured": true, 00:22:29.429 "data_offset": 2048, 00:22:29.429 "data_size": 63488 00:22:29.429 }, 00:22:29.429 { 00:22:29.429 "name": "BaseBdev4", 00:22:29.429 "uuid": "088add14-debf-5a7b-beda-89882477f712", 00:22:29.429 "is_configured": true, 00:22:29.429 "data_offset": 2048, 00:22:29.429 "data_size": 63488 00:22:29.429 } 00:22:29.429 ] 00:22:29.429 }' 00:22:29.429 12:03:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:29.429 12:03:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.996 12:03:43 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:22:29.996 12:03:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:30.255 [2024-07-15 12:03:43.600822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17e4a40 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:31.192 12:03:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.192 12:03:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:31.452 12:03:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:31.452 "name": "raid_bdev1", 00:22:31.452 "uuid": "6c13dcf1-4b73-4bc6-85b3-c2f9eecc0617", 00:22:31.452 "strip_size_kb": 64, 00:22:31.452 "state": "online", 00:22:31.452 "raid_level": "concat", 00:22:31.452 "superblock": true, 00:22:31.452 "num_base_bdevs": 4, 00:22:31.452 "num_base_bdevs_discovered": 4, 00:22:31.452 "num_base_bdevs_operational": 4, 00:22:31.452 "base_bdevs_list": [ 00:22:31.452 { 00:22:31.452 "name": "BaseBdev1", 00:22:31.452 "uuid": "4464bb4e-8d5f-5361-a5e5-9cd0cdb75100", 00:22:31.452 "is_configured": true, 00:22:31.452 "data_offset": 2048, 00:22:31.452 "data_size": 63488 00:22:31.452 }, 00:22:31.452 { 00:22:31.452 "name": "BaseBdev2", 00:22:31.452 "uuid": "1c33e732-e166-54f7-9bb2-596c9a5a700a", 00:22:31.452 "is_configured": true, 00:22:31.452 "data_offset": 2048, 00:22:31.452 "data_size": 63488 00:22:31.452 }, 00:22:31.452 { 00:22:31.452 "name": "BaseBdev3", 00:22:31.452 "uuid": "562868b2-7221-57e5-89eb-a17bcca7f62f", 00:22:31.452 "is_configured": true, 00:22:31.452 "data_offset": 2048, 00:22:31.452 "data_size": 63488 00:22:31.452 }, 00:22:31.452 { 00:22:31.452 "name": "BaseBdev4", 00:22:31.452 "uuid": "088add14-debf-5a7b-beda-89882477f712", 00:22:31.452 "is_configured": true, 00:22:31.452 "data_offset": 2048, 00:22:31.452 "data_size": 63488 00:22:31.452 } 00:22:31.452 ] 00:22:31.452 }' 00:22:31.452 12:03:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:31.452 12:03:45 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:32.390 [2024-07-15 12:03:45.912135] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:32.390 [2024-07-15 12:03:45.912177] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:32.390 [2024-07-15 12:03:45.915356] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:32.390 [2024-07-15 12:03:45.915395] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:32.390 [2024-07-15 12:03:45.915433] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:32.390 [2024-07-15 12:03:45.915445] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17e2090 name raid_bdev1, state offline 00:22:32.390 0 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1546295 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1546295 ']' 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1546295 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:22:32.390 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1546295 00:22:32.650 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:22:32.650 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:22:32.650 12:03:45 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@966 -- # echo 'killing process with pid 1546295' 00:22:32.650 killing process with pid 1546295 00:22:32.650 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1546295 00:22:32.650 [2024-07-15 12:03:45.998590] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:32.650 12:03:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1546295 00:22:32.650 [2024-07-15 12:03:46.030925] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.foItAEn1wk 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.43 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.43 != \0\.\0\0 ]] 00:22:32.910 00:22:32.910 real 0m6.916s 00:22:32.910 user 0m10.816s 00:22:32.910 sys 0m1.284s 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:22:32.910 12:03:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.910 ************************************ 00:22:32.910 END TEST raid_write_error_test 00:22:32.910 ************************************ 00:22:32.910 12:03:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:22:32.910 12:03:46 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:22:32.910 
12:03:46 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:22:32.910 12:03:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:22:32.910 12:03:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:22:32.910 12:03:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:32.910 ************************************ 00:22:32.910 START TEST raid_state_function_test 00:22:32.910 ************************************ 00:22:32.910 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 false 00:22:32.910 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=1547270 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1547270' 00:22:32.911 Process raid pid: 1547270 00:22:32.911 12:03:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 1547270 /var/tmp/spdk-raid.sock 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@829 -- # '[' -z 1547270 ']' 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:32.911 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:22:32.911 12:03:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:32.911 [2024-07-15 12:03:46.427671] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:22:32.911 [2024-07-15 12:03:46.427740] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:22:33.170 [2024-07-15 12:03:46.542075] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:33.170 [2024-07-15 12:03:46.647258] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:33.170 [2024-07-15 12:03:46.706829] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:33.170 [2024-07-15 12:03:46.706875] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@862 -- # return 0 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:34.110 [2024-07-15 12:03:47.579845] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:34.110 [2024-07-15 12:03:47.579893] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:34.110 [2024-07-15 12:03:47.579904] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:34.110 [2024-07-15 12:03:47.579916] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:34.110 [2024-07-15 12:03:47.579924] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:34.110 [2024-07-15 12:03:47.579935] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:34.110 
[2024-07-15 12:03:47.579947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:34.110 [2024-07-15 12:03:47.579958] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.110 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:34.369 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:34.369 "name": "Existed_Raid", 00:22:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.369 "strip_size_kb": 0, 00:22:34.369 "state": 
"configuring", 00:22:34.369 "raid_level": "raid1", 00:22:34.369 "superblock": false, 00:22:34.369 "num_base_bdevs": 4, 00:22:34.369 "num_base_bdevs_discovered": 0, 00:22:34.369 "num_base_bdevs_operational": 4, 00:22:34.369 "base_bdevs_list": [ 00:22:34.369 { 00:22:34.369 "name": "BaseBdev1", 00:22:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.369 "is_configured": false, 00:22:34.369 "data_offset": 0, 00:22:34.369 "data_size": 0 00:22:34.369 }, 00:22:34.369 { 00:22:34.369 "name": "BaseBdev2", 00:22:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.369 "is_configured": false, 00:22:34.369 "data_offset": 0, 00:22:34.369 "data_size": 0 00:22:34.369 }, 00:22:34.369 { 00:22:34.369 "name": "BaseBdev3", 00:22:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.369 "is_configured": false, 00:22:34.369 "data_offset": 0, 00:22:34.369 "data_size": 0 00:22:34.369 }, 00:22:34.369 { 00:22:34.369 "name": "BaseBdev4", 00:22:34.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:34.369 "is_configured": false, 00:22:34.369 "data_offset": 0, 00:22:34.369 "data_size": 0 00:22:34.369 } 00:22:34.369 ] 00:22:34.369 }' 00:22:34.369 12:03:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:34.369 12:03:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:34.938 12:03:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:35.198 [2024-07-15 12:03:48.594454] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:35.198 [2024-07-15 12:03:48.594486] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd55b20 name Existed_Raid, state configuring 00:22:35.198 12:03:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:35.457 [2024-07-15 12:03:48.855159] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:35.457 [2024-07-15 12:03:48.855187] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:35.457 [2024-07-15 12:03:48.855197] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:35.457 [2024-07-15 12:03:48.855208] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:35.457 [2024-07-15 12:03:48.855216] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:35.457 [2024-07-15 12:03:48.855227] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:35.457 [2024-07-15 12:03:48.855236] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:35.457 [2024-07-15 12:03:48.855246] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:35.457 12:03:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:35.716 [2024-07-15 12:03:49.109583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:35.716 BaseBdev1 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:35.716 12:03:49 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:35.716 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:35.975 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:36.234 [ 00:22:36.234 { 00:22:36.234 "name": "BaseBdev1", 00:22:36.235 "aliases": [ 00:22:36.235 "8ea34a67-dc33-4537-a06e-4a803b550cbb" 00:22:36.235 ], 00:22:36.235 "product_name": "Malloc disk", 00:22:36.235 "block_size": 512, 00:22:36.235 "num_blocks": 65536, 00:22:36.235 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:36.235 "assigned_rate_limits": { 00:22:36.235 "rw_ios_per_sec": 0, 00:22:36.235 "rw_mbytes_per_sec": 0, 00:22:36.235 "r_mbytes_per_sec": 0, 00:22:36.235 "w_mbytes_per_sec": 0 00:22:36.235 }, 00:22:36.235 "claimed": true, 00:22:36.235 "claim_type": "exclusive_write", 00:22:36.235 "zoned": false, 00:22:36.235 "supported_io_types": { 00:22:36.235 "read": true, 00:22:36.235 "write": true, 00:22:36.235 "unmap": true, 00:22:36.235 "flush": true, 00:22:36.235 "reset": true, 00:22:36.235 "nvme_admin": false, 00:22:36.235 "nvme_io": false, 00:22:36.235 "nvme_io_md": false, 00:22:36.235 "write_zeroes": true, 00:22:36.235 "zcopy": true, 00:22:36.235 "get_zone_info": false, 00:22:36.235 "zone_management": false, 00:22:36.235 "zone_append": false, 00:22:36.235 "compare": false, 00:22:36.235 "compare_and_write": false, 00:22:36.235 "abort": true, 00:22:36.235 "seek_hole": false, 00:22:36.235 "seek_data": false, 00:22:36.235 "copy": true, 00:22:36.235 "nvme_iov_md": false 00:22:36.235 }, 00:22:36.235 "memory_domains": [ 00:22:36.235 { 
00:22:36.235 "dma_device_id": "system", 00:22:36.235 "dma_device_type": 1 00:22:36.235 }, 00:22:36.235 { 00:22:36.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:36.235 "dma_device_type": 2 00:22:36.235 } 00:22:36.235 ], 00:22:36.235 "driver_specific": {} 00:22:36.235 } 00:22:36.235 ] 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:36.235 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:36.494 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:22:36.494 "name": "Existed_Raid", 00:22:36.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.494 "strip_size_kb": 0, 00:22:36.494 "state": "configuring", 00:22:36.494 "raid_level": "raid1", 00:22:36.494 "superblock": false, 00:22:36.494 "num_base_bdevs": 4, 00:22:36.494 "num_base_bdevs_discovered": 1, 00:22:36.494 "num_base_bdevs_operational": 4, 00:22:36.494 "base_bdevs_list": [ 00:22:36.494 { 00:22:36.494 "name": "BaseBdev1", 00:22:36.494 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:36.494 "is_configured": true, 00:22:36.494 "data_offset": 0, 00:22:36.494 "data_size": 65536 00:22:36.494 }, 00:22:36.494 { 00:22:36.494 "name": "BaseBdev2", 00:22:36.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.494 "is_configured": false, 00:22:36.494 "data_offset": 0, 00:22:36.494 "data_size": 0 00:22:36.494 }, 00:22:36.494 { 00:22:36.494 "name": "BaseBdev3", 00:22:36.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.494 "is_configured": false, 00:22:36.494 "data_offset": 0, 00:22:36.494 "data_size": 0 00:22:36.494 }, 00:22:36.494 { 00:22:36.494 "name": "BaseBdev4", 00:22:36.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:36.494 "is_configured": false, 00:22:36.494 "data_offset": 0, 00:22:36.494 "data_size": 0 00:22:36.494 } 00:22:36.494 ] 00:22:36.494 }' 00:22:36.494 12:03:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:36.494 12:03:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:37.061 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:22:37.319 [2024-07-15 12:03:50.713832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:22:37.319 [2024-07-15 12:03:50.713879] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd55390 name Existed_Raid, state configuring 
00:22:37.319 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:37.577 [2024-07-15 12:03:50.958502] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:37.577 [2024-07-15 12:03:50.959947] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:22:37.577 [2024-07-15 12:03:50.959979] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:22:37.577 [2024-07-15 12:03:50.959989] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:22:37.577 [2024-07-15 12:03:50.960000] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:22:37.577 [2024-07-15 12:03:50.960009] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:22:37.577 [2024-07-15 12:03:50.960020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.577 12:03:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:37.837 12:03:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:37.837 "name": "Existed_Raid", 00:22:37.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.837 "strip_size_kb": 0, 00:22:37.837 "state": "configuring", 00:22:37.837 "raid_level": "raid1", 00:22:37.837 "superblock": false, 00:22:37.837 "num_base_bdevs": 4, 00:22:37.837 "num_base_bdevs_discovered": 1, 00:22:37.837 "num_base_bdevs_operational": 4, 00:22:37.837 "base_bdevs_list": [ 00:22:37.837 { 00:22:37.837 "name": "BaseBdev1", 00:22:37.837 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:37.837 "is_configured": true, 00:22:37.837 "data_offset": 0, 00:22:37.837 "data_size": 65536 00:22:37.837 }, 00:22:37.837 { 00:22:37.837 "name": "BaseBdev2", 00:22:37.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.837 "is_configured": false, 00:22:37.837 "data_offset": 0, 00:22:37.837 "data_size": 0 00:22:37.837 }, 00:22:37.837 { 00:22:37.837 "name": "BaseBdev3", 00:22:37.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.837 "is_configured": false, 00:22:37.837 
"data_offset": 0, 00:22:37.837 "data_size": 0 00:22:37.837 }, 00:22:37.837 { 00:22:37.837 "name": "BaseBdev4", 00:22:37.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:37.837 "is_configured": false, 00:22:37.837 "data_offset": 0, 00:22:37.837 "data_size": 0 00:22:37.837 } 00:22:37.837 ] 00:22:37.837 }' 00:22:37.837 12:03:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:37.837 12:03:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:38.810 [2024-07-15 12:03:52.353646] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:38.810 BaseBdev2 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:38.810 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:39.379 12:03:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:39.638 [ 
00:22:39.638 { 00:22:39.638 "name": "BaseBdev2", 00:22:39.638 "aliases": [ 00:22:39.638 "531e584c-3910-4605-b41b-1d38d1a0104d" 00:22:39.638 ], 00:22:39.638 "product_name": "Malloc disk", 00:22:39.638 "block_size": 512, 00:22:39.638 "num_blocks": 65536, 00:22:39.638 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:39.638 "assigned_rate_limits": { 00:22:39.638 "rw_ios_per_sec": 0, 00:22:39.638 "rw_mbytes_per_sec": 0, 00:22:39.638 "r_mbytes_per_sec": 0, 00:22:39.638 "w_mbytes_per_sec": 0 00:22:39.638 }, 00:22:39.638 "claimed": true, 00:22:39.638 "claim_type": "exclusive_write", 00:22:39.638 "zoned": false, 00:22:39.638 "supported_io_types": { 00:22:39.638 "read": true, 00:22:39.638 "write": true, 00:22:39.638 "unmap": true, 00:22:39.638 "flush": true, 00:22:39.638 "reset": true, 00:22:39.638 "nvme_admin": false, 00:22:39.638 "nvme_io": false, 00:22:39.638 "nvme_io_md": false, 00:22:39.638 "write_zeroes": true, 00:22:39.638 "zcopy": true, 00:22:39.638 "get_zone_info": false, 00:22:39.638 "zone_management": false, 00:22:39.638 "zone_append": false, 00:22:39.638 "compare": false, 00:22:39.638 "compare_and_write": false, 00:22:39.638 "abort": true, 00:22:39.638 "seek_hole": false, 00:22:39.638 "seek_data": false, 00:22:39.638 "copy": true, 00:22:39.638 "nvme_iov_md": false 00:22:39.638 }, 00:22:39.638 "memory_domains": [ 00:22:39.638 { 00:22:39.638 "dma_device_id": "system", 00:22:39.638 "dma_device_type": 1 00:22:39.638 }, 00:22:39.638 { 00:22:39.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:39.638 "dma_device_type": 2 00:22:39.638 } 00:22:39.638 ], 00:22:39.638 "driver_specific": {} 00:22:39.638 } 00:22:39.638 ] 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:39.638 12:03:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.638 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:39.896 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.896 "name": "Existed_Raid", 00:22:39.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.896 "strip_size_kb": 0, 00:22:39.896 "state": "configuring", 00:22:39.896 "raid_level": "raid1", 00:22:39.896 "superblock": false, 00:22:39.896 "num_base_bdevs": 4, 00:22:39.896 "num_base_bdevs_discovered": 2, 00:22:39.896 "num_base_bdevs_operational": 4, 00:22:39.896 "base_bdevs_list": [ 00:22:39.896 { 00:22:39.896 
"name": "BaseBdev1", 00:22:39.896 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:39.896 "is_configured": true, 00:22:39.896 "data_offset": 0, 00:22:39.896 "data_size": 65536 00:22:39.896 }, 00:22:39.896 { 00:22:39.896 "name": "BaseBdev2", 00:22:39.896 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:39.896 "is_configured": true, 00:22:39.896 "data_offset": 0, 00:22:39.896 "data_size": 65536 00:22:39.896 }, 00:22:39.896 { 00:22:39.896 "name": "BaseBdev3", 00:22:39.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.896 "is_configured": false, 00:22:39.896 "data_offset": 0, 00:22:39.896 "data_size": 0 00:22:39.896 }, 00:22:39.896 { 00:22:39.896 "name": "BaseBdev4", 00:22:39.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:39.896 "is_configured": false, 00:22:39.896 "data_offset": 0, 00:22:39.896 "data_size": 0 00:22:39.896 } 00:22:39.896 ] 00:22:39.896 }' 00:22:39.896 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.896 12:03:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:40.463 12:03:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:22:40.722 [2024-07-15 12:03:54.230184] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:40.722 BaseBdev3 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:40.722 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:40.980 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:41.239 [ 00:22:41.239 { 00:22:41.239 "name": "BaseBdev3", 00:22:41.239 "aliases": [ 00:22:41.239 "57cb55b5-0a0a-4e2a-a4ed-01d7882be280" 00:22:41.239 ], 00:22:41.239 "product_name": "Malloc disk", 00:22:41.239 "block_size": 512, 00:22:41.239 "num_blocks": 65536, 00:22:41.239 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:41.239 "assigned_rate_limits": { 00:22:41.239 "rw_ios_per_sec": 0, 00:22:41.239 "rw_mbytes_per_sec": 0, 00:22:41.239 "r_mbytes_per_sec": 0, 00:22:41.239 "w_mbytes_per_sec": 0 00:22:41.239 }, 00:22:41.239 "claimed": true, 00:22:41.239 "claim_type": "exclusive_write", 00:22:41.239 "zoned": false, 00:22:41.239 "supported_io_types": { 00:22:41.239 "read": true, 00:22:41.239 "write": true, 00:22:41.239 "unmap": true, 00:22:41.239 "flush": true, 00:22:41.239 "reset": true, 00:22:41.239 "nvme_admin": false, 00:22:41.239 "nvme_io": false, 00:22:41.239 "nvme_io_md": false, 00:22:41.239 "write_zeroes": true, 00:22:41.239 "zcopy": true, 00:22:41.239 "get_zone_info": false, 00:22:41.239 "zone_management": false, 00:22:41.239 "zone_append": false, 00:22:41.239 "compare": false, 00:22:41.239 "compare_and_write": false, 00:22:41.239 "abort": true, 00:22:41.239 "seek_hole": false, 00:22:41.239 "seek_data": false, 00:22:41.239 "copy": true, 00:22:41.239 "nvme_iov_md": false 00:22:41.239 }, 00:22:41.239 "memory_domains": [ 00:22:41.239 { 00:22:41.239 "dma_device_id": "system", 
00:22:41.239 "dma_device_type": 1 00:22:41.239 }, 00:22:41.239 { 00:22:41.239 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:41.239 "dma_device_type": 2 00:22:41.239 } 00:22:41.239 ], 00:22:41.239 "driver_specific": {} 00:22:41.239 } 00:22:41.239 ] 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:41.239 12:03:54 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:41.498 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:41.498 "name": "Existed_Raid", 00:22:41.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.498 "strip_size_kb": 0, 00:22:41.498 "state": "configuring", 00:22:41.498 "raid_level": "raid1", 00:22:41.498 "superblock": false, 00:22:41.498 "num_base_bdevs": 4, 00:22:41.498 "num_base_bdevs_discovered": 3, 00:22:41.498 "num_base_bdevs_operational": 4, 00:22:41.498 "base_bdevs_list": [ 00:22:41.498 { 00:22:41.498 "name": "BaseBdev1", 00:22:41.498 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:41.498 "is_configured": true, 00:22:41.498 "data_offset": 0, 00:22:41.498 "data_size": 65536 00:22:41.498 }, 00:22:41.498 { 00:22:41.498 "name": "BaseBdev2", 00:22:41.498 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:41.498 "is_configured": true, 00:22:41.498 "data_offset": 0, 00:22:41.498 "data_size": 65536 00:22:41.498 }, 00:22:41.498 { 00:22:41.498 "name": "BaseBdev3", 00:22:41.498 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:41.498 "is_configured": true, 00:22:41.498 "data_offset": 0, 00:22:41.498 "data_size": 65536 00:22:41.498 }, 00:22:41.498 { 00:22:41.498 "name": "BaseBdev4", 00:22:41.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:41.498 "is_configured": false, 00:22:41.498 "data_offset": 0, 00:22:41.498 "data_size": 0 00:22:41.498 } 00:22:41.498 ] 00:22:41.498 }' 00:22:41.498 12:03:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:41.498 12:03:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:42.063 12:03:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:42.322 [2024-07-15 12:03:55.757624] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:42.322 [2024-07-15 12:03:55.757665] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd564a0 00:22:42.322 [2024-07-15 12:03:55.757674] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:42.322 [2024-07-15 12:03:55.757949] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd560a0 00:22:42.322 [2024-07-15 12:03:55.758079] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd564a0 00:22:42.322 [2024-07-15 12:03:55.758089] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd564a0 00:22:42.322 [2024-07-15 12:03:55.758262] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.322 BaseBdev4 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:42.322 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:42.580 12:03:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:42.839 [ 00:22:42.839 { 00:22:42.839 
"name": "BaseBdev4", 00:22:42.839 "aliases": [ 00:22:42.839 "235ecbb1-90f8-498e-94b9-e28ee537870e" 00:22:42.839 ], 00:22:42.839 "product_name": "Malloc disk", 00:22:42.839 "block_size": 512, 00:22:42.839 "num_blocks": 65536, 00:22:42.839 "uuid": "235ecbb1-90f8-498e-94b9-e28ee537870e", 00:22:42.839 "assigned_rate_limits": { 00:22:42.839 "rw_ios_per_sec": 0, 00:22:42.839 "rw_mbytes_per_sec": 0, 00:22:42.839 "r_mbytes_per_sec": 0, 00:22:42.839 "w_mbytes_per_sec": 0 00:22:42.839 }, 00:22:42.839 "claimed": true, 00:22:42.839 "claim_type": "exclusive_write", 00:22:42.839 "zoned": false, 00:22:42.839 "supported_io_types": { 00:22:42.839 "read": true, 00:22:42.839 "write": true, 00:22:42.839 "unmap": true, 00:22:42.839 "flush": true, 00:22:42.839 "reset": true, 00:22:42.839 "nvme_admin": false, 00:22:42.839 "nvme_io": false, 00:22:42.839 "nvme_io_md": false, 00:22:42.839 "write_zeroes": true, 00:22:42.839 "zcopy": true, 00:22:42.839 "get_zone_info": false, 00:22:42.839 "zone_management": false, 00:22:42.839 "zone_append": false, 00:22:42.839 "compare": false, 00:22:42.839 "compare_and_write": false, 00:22:42.839 "abort": true, 00:22:42.839 "seek_hole": false, 00:22:42.839 "seek_data": false, 00:22:42.839 "copy": true, 00:22:42.839 "nvme_iov_md": false 00:22:42.839 }, 00:22:42.839 "memory_domains": [ 00:22:42.839 { 00:22:42.839 "dma_device_id": "system", 00:22:42.839 "dma_device_type": 1 00:22:42.839 }, 00:22:42.839 { 00:22:42.839 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:42.839 "dma_device_type": 2 00:22:42.839 } 00:22:42.839 ], 00:22:42.839 "driver_specific": {} 00:22:42.839 } 00:22:42.839 ] 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.839 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:43.097 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:43.097 "name": "Existed_Raid", 00:22:43.097 "uuid": "e5a30d9b-7f98-4f2e-b9c0-8f7206ab9368", 00:22:43.097 "strip_size_kb": 0, 00:22:43.097 "state": "online", 00:22:43.097 "raid_level": "raid1", 00:22:43.098 "superblock": false, 00:22:43.098 "num_base_bdevs": 4, 00:22:43.098 "num_base_bdevs_discovered": 4, 00:22:43.098 "num_base_bdevs_operational": 4, 00:22:43.098 "base_bdevs_list": [ 00:22:43.098 { 00:22:43.098 "name": "BaseBdev1", 00:22:43.098 "uuid": 
"8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:43.098 "is_configured": true, 00:22:43.098 "data_offset": 0, 00:22:43.098 "data_size": 65536 00:22:43.098 }, 00:22:43.098 { 00:22:43.098 "name": "BaseBdev2", 00:22:43.098 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:43.098 "is_configured": true, 00:22:43.098 "data_offset": 0, 00:22:43.098 "data_size": 65536 00:22:43.098 }, 00:22:43.098 { 00:22:43.098 "name": "BaseBdev3", 00:22:43.098 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:43.098 "is_configured": true, 00:22:43.098 "data_offset": 0, 00:22:43.098 "data_size": 65536 00:22:43.098 }, 00:22:43.098 { 00:22:43.098 "name": "BaseBdev4", 00:22:43.098 "uuid": "235ecbb1-90f8-498e-94b9-e28ee537870e", 00:22:43.098 "is_configured": true, 00:22:43.098 "data_offset": 0, 00:22:43.098 "data_size": 65536 00:22:43.098 } 00:22:43.098 ] 00:22:43.098 }' 00:22:43.098 12:03:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:43.098 12:03:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:43.665 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:22:43.665 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:22:43.666 12:03:57 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:22:43.925 [2024-07-15 12:03:57.322100] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:22:43.925 "name": "Existed_Raid", 00:22:43.925 "aliases": [ 00:22:43.925 "e5a30d9b-7f98-4f2e-b9c0-8f7206ab9368" 00:22:43.925 ], 00:22:43.925 "product_name": "Raid Volume", 00:22:43.925 "block_size": 512, 00:22:43.925 "num_blocks": 65536, 00:22:43.925 "uuid": "e5a30d9b-7f98-4f2e-b9c0-8f7206ab9368", 00:22:43.925 "assigned_rate_limits": { 00:22:43.925 "rw_ios_per_sec": 0, 00:22:43.925 "rw_mbytes_per_sec": 0, 00:22:43.925 "r_mbytes_per_sec": 0, 00:22:43.925 "w_mbytes_per_sec": 0 00:22:43.925 }, 00:22:43.925 "claimed": false, 00:22:43.925 "zoned": false, 00:22:43.925 "supported_io_types": { 00:22:43.925 "read": true, 00:22:43.925 "write": true, 00:22:43.925 "unmap": false, 00:22:43.925 "flush": false, 00:22:43.925 "reset": true, 00:22:43.925 "nvme_admin": false, 00:22:43.925 "nvme_io": false, 00:22:43.925 "nvme_io_md": false, 00:22:43.925 "write_zeroes": true, 00:22:43.925 "zcopy": false, 00:22:43.925 "get_zone_info": false, 00:22:43.925 "zone_management": false, 00:22:43.925 "zone_append": false, 00:22:43.925 "compare": false, 00:22:43.925 "compare_and_write": false, 00:22:43.925 "abort": false, 00:22:43.925 "seek_hole": false, 00:22:43.925 "seek_data": false, 00:22:43.925 "copy": false, 00:22:43.925 "nvme_iov_md": false 00:22:43.925 }, 00:22:43.925 "memory_domains": [ 00:22:43.925 { 00:22:43.925 "dma_device_id": "system", 00:22:43.925 "dma_device_type": 1 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.925 "dma_device_type": 2 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "system", 00:22:43.925 "dma_device_type": 1 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.925 "dma_device_type": 2 
00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "system", 00:22:43.925 "dma_device_type": 1 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.925 "dma_device_type": 2 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "system", 00:22:43.925 "dma_device_type": 1 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:43.925 "dma_device_type": 2 00:22:43.925 } 00:22:43.925 ], 00:22:43.925 "driver_specific": { 00:22:43.925 "raid": { 00:22:43.925 "uuid": "e5a30d9b-7f98-4f2e-b9c0-8f7206ab9368", 00:22:43.925 "strip_size_kb": 0, 00:22:43.925 "state": "online", 00:22:43.925 "raid_level": "raid1", 00:22:43.925 "superblock": false, 00:22:43.925 "num_base_bdevs": 4, 00:22:43.925 "num_base_bdevs_discovered": 4, 00:22:43.925 "num_base_bdevs_operational": 4, 00:22:43.925 "base_bdevs_list": [ 00:22:43.925 { 00:22:43.925 "name": "BaseBdev1", 00:22:43.925 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:43.925 "is_configured": true, 00:22:43.925 "data_offset": 0, 00:22:43.925 "data_size": 65536 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "name": "BaseBdev2", 00:22:43.925 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:43.925 "is_configured": true, 00:22:43.925 "data_offset": 0, 00:22:43.925 "data_size": 65536 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "name": "BaseBdev3", 00:22:43.925 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:43.925 "is_configured": true, 00:22:43.925 "data_offset": 0, 00:22:43.925 "data_size": 65536 00:22:43.925 }, 00:22:43.925 { 00:22:43.925 "name": "BaseBdev4", 00:22:43.925 "uuid": "235ecbb1-90f8-498e-94b9-e28ee537870e", 00:22:43.925 "is_configured": true, 00:22:43.925 "data_offset": 0, 00:22:43.925 "data_size": 65536 00:22:43.925 } 00:22:43.925 ] 00:22:43.925 } 00:22:43.925 } 00:22:43.925 }' 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | 
select(.is_configured == true).name' 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:22:43.925 BaseBdev2 00:22:43.925 BaseBdev3 00:22:43.925 BaseBdev4' 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:22:43.925 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:44.184 "name": "BaseBdev1", 00:22:44.184 "aliases": [ 00:22:44.184 "8ea34a67-dc33-4537-a06e-4a803b550cbb" 00:22:44.184 ], 00:22:44.184 "product_name": "Malloc disk", 00:22:44.184 "block_size": 512, 00:22:44.184 "num_blocks": 65536, 00:22:44.184 "uuid": "8ea34a67-dc33-4537-a06e-4a803b550cbb", 00:22:44.184 "assigned_rate_limits": { 00:22:44.184 "rw_ios_per_sec": 0, 00:22:44.184 "rw_mbytes_per_sec": 0, 00:22:44.184 "r_mbytes_per_sec": 0, 00:22:44.184 "w_mbytes_per_sec": 0 00:22:44.184 }, 00:22:44.184 "claimed": true, 00:22:44.184 "claim_type": "exclusive_write", 00:22:44.184 "zoned": false, 00:22:44.184 "supported_io_types": { 00:22:44.184 "read": true, 00:22:44.184 "write": true, 00:22:44.184 "unmap": true, 00:22:44.184 "flush": true, 00:22:44.184 "reset": true, 00:22:44.184 "nvme_admin": false, 00:22:44.184 "nvme_io": false, 00:22:44.184 "nvme_io_md": false, 00:22:44.184 "write_zeroes": true, 00:22:44.184 "zcopy": true, 00:22:44.184 "get_zone_info": false, 00:22:44.184 "zone_management": false, 00:22:44.184 "zone_append": false, 00:22:44.184 "compare": false, 00:22:44.184 "compare_and_write": false, 00:22:44.184 "abort": true, 00:22:44.184 "seek_hole": false, 00:22:44.184 "seek_data": false, 00:22:44.184 "copy": true, 00:22:44.184 
"nvme_iov_md": false 00:22:44.184 }, 00:22:44.184 "memory_domains": [ 00:22:44.184 { 00:22:44.184 "dma_device_id": "system", 00:22:44.184 "dma_device_type": 1 00:22:44.184 }, 00:22:44.184 { 00:22:44.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.184 "dma_device_type": 2 00:22:44.184 } 00:22:44.184 ], 00:22:44.184 "driver_specific": {} 00:22:44.184 }' 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.184 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:22:44.443 12:03:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 
-- # jq '.[]' 00:22:44.702 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:44.702 "name": "BaseBdev2", 00:22:44.702 "aliases": [ 00:22:44.702 "531e584c-3910-4605-b41b-1d38d1a0104d" 00:22:44.702 ], 00:22:44.702 "product_name": "Malloc disk", 00:22:44.702 "block_size": 512, 00:22:44.702 "num_blocks": 65536, 00:22:44.702 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:44.702 "assigned_rate_limits": { 00:22:44.702 "rw_ios_per_sec": 0, 00:22:44.702 "rw_mbytes_per_sec": 0, 00:22:44.702 "r_mbytes_per_sec": 0, 00:22:44.702 "w_mbytes_per_sec": 0 00:22:44.702 }, 00:22:44.702 "claimed": true, 00:22:44.702 "claim_type": "exclusive_write", 00:22:44.702 "zoned": false, 00:22:44.702 "supported_io_types": { 00:22:44.702 "read": true, 00:22:44.702 "write": true, 00:22:44.702 "unmap": true, 00:22:44.702 "flush": true, 00:22:44.702 "reset": true, 00:22:44.702 "nvme_admin": false, 00:22:44.702 "nvme_io": false, 00:22:44.702 "nvme_io_md": false, 00:22:44.702 "write_zeroes": true, 00:22:44.702 "zcopy": true, 00:22:44.702 "get_zone_info": false, 00:22:44.702 "zone_management": false, 00:22:44.702 "zone_append": false, 00:22:44.702 "compare": false, 00:22:44.702 "compare_and_write": false, 00:22:44.702 "abort": true, 00:22:44.702 "seek_hole": false, 00:22:44.702 "seek_data": false, 00:22:44.702 "copy": true, 00:22:44.702 "nvme_iov_md": false 00:22:44.702 }, 00:22:44.702 "memory_domains": [ 00:22:44.702 { 00:22:44.702 "dma_device_id": "system", 00:22:44.702 "dma_device_type": 1 00:22:44.702 }, 00:22:44.702 { 00:22:44.702 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:44.702 "dma_device_type": 2 00:22:44.702 } 00:22:44.702 ], 00:22:44.702 "driver_specific": {} 00:22:44.702 }' 00:22:44.702 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.702 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:44.961 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.220 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.220 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:45.220 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.220 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:22:45.220 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.478 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.478 "name": "BaseBdev3", 00:22:45.478 "aliases": [ 00:22:45.478 "57cb55b5-0a0a-4e2a-a4ed-01d7882be280" 00:22:45.478 ], 00:22:45.478 "product_name": "Malloc disk", 00:22:45.478 "block_size": 512, 00:22:45.478 "num_blocks": 65536, 00:22:45.478 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:45.478 "assigned_rate_limits": { 00:22:45.478 "rw_ios_per_sec": 0, 00:22:45.478 "rw_mbytes_per_sec": 0, 00:22:45.478 "r_mbytes_per_sec": 0, 00:22:45.478 "w_mbytes_per_sec": 0 00:22:45.478 }, 
00:22:45.478 "claimed": true, 00:22:45.478 "claim_type": "exclusive_write", 00:22:45.478 "zoned": false, 00:22:45.478 "supported_io_types": { 00:22:45.478 "read": true, 00:22:45.478 "write": true, 00:22:45.478 "unmap": true, 00:22:45.478 "flush": true, 00:22:45.478 "reset": true, 00:22:45.478 "nvme_admin": false, 00:22:45.478 "nvme_io": false, 00:22:45.478 "nvme_io_md": false, 00:22:45.478 "write_zeroes": true, 00:22:45.478 "zcopy": true, 00:22:45.478 "get_zone_info": false, 00:22:45.478 "zone_management": false, 00:22:45.478 "zone_append": false, 00:22:45.478 "compare": false, 00:22:45.478 "compare_and_write": false, 00:22:45.478 "abort": true, 00:22:45.478 "seek_hole": false, 00:22:45.478 "seek_data": false, 00:22:45.478 "copy": true, 00:22:45.478 "nvme_iov_md": false 00:22:45.478 }, 00:22:45.478 "memory_domains": [ 00:22:45.478 { 00:22:45.478 "dma_device_id": "system", 00:22:45.478 "dma_device_type": 1 00:22:45.478 }, 00:22:45.478 { 00:22:45.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.479 "dma_device_type": 2 00:22:45.479 } 00:22:45.479 ], 00:22:45.479 "driver_specific": {} 00:22:45.479 }' 00:22:45.479 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.479 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.479 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.479 12:03:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.479 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:45.479 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:45.479 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:22:45.737 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:22:45.995 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:22:45.995 "name": "BaseBdev4", 00:22:45.995 "aliases": [ 00:22:45.995 "235ecbb1-90f8-498e-94b9-e28ee537870e" 00:22:45.995 ], 00:22:45.995 "product_name": "Malloc disk", 00:22:45.995 "block_size": 512, 00:22:45.995 "num_blocks": 65536, 00:22:45.995 "uuid": "235ecbb1-90f8-498e-94b9-e28ee537870e", 00:22:45.995 "assigned_rate_limits": { 00:22:45.995 "rw_ios_per_sec": 0, 00:22:45.995 "rw_mbytes_per_sec": 0, 00:22:45.995 "r_mbytes_per_sec": 0, 00:22:45.995 "w_mbytes_per_sec": 0 00:22:45.995 }, 00:22:45.995 "claimed": true, 00:22:45.995 "claim_type": "exclusive_write", 00:22:45.995 "zoned": false, 00:22:45.995 "supported_io_types": { 00:22:45.995 "read": true, 00:22:45.995 "write": true, 00:22:45.995 "unmap": true, 00:22:45.995 "flush": true, 00:22:45.995 "reset": true, 00:22:45.995 "nvme_admin": false, 00:22:45.995 "nvme_io": false, 00:22:45.995 "nvme_io_md": false, 00:22:45.995 "write_zeroes": true, 00:22:45.995 "zcopy": true, 00:22:45.995 "get_zone_info": false, 00:22:45.995 "zone_management": false, 00:22:45.995 "zone_append": false, 00:22:45.995 "compare": false, 00:22:45.995 "compare_and_write": false, 
00:22:45.995 "abort": true, 00:22:45.996 "seek_hole": false, 00:22:45.996 "seek_data": false, 00:22:45.996 "copy": true, 00:22:45.996 "nvme_iov_md": false 00:22:45.996 }, 00:22:45.996 "memory_domains": [ 00:22:45.996 { 00:22:45.996 "dma_device_id": "system", 00:22:45.996 "dma_device_type": 1 00:22:45.996 }, 00:22:45.996 { 00:22:45.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:45.996 "dma_device_type": 2 00:22:45.996 } 00:22:45.996 ], 00:22:45.996 "driver_specific": {} 00:22:45.996 }' 00:22:45.996 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.996 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:22:45.996 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:22:45.996 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.253 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:22:46.253 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:22:46.254 12:03:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:46.517 [2024-07-15 12:04:00.057111] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.517 12:04:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:46.776 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:46.776 "name": "Existed_Raid", 00:22:46.776 "uuid": "e5a30d9b-7f98-4f2e-b9c0-8f7206ab9368", 00:22:46.776 "strip_size_kb": 0, 00:22:46.776 "state": "online", 00:22:46.776 "raid_level": "raid1", 00:22:46.776 "superblock": false, 00:22:46.776 "num_base_bdevs": 4, 00:22:46.776 "num_base_bdevs_discovered": 3, 00:22:46.776 "num_base_bdevs_operational": 3, 00:22:46.776 "base_bdevs_list": [ 00:22:46.776 { 00:22:46.776 "name": null, 00:22:46.776 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:46.776 "is_configured": false, 00:22:46.776 "data_offset": 0, 00:22:46.776 "data_size": 65536 00:22:46.776 }, 00:22:46.776 { 00:22:46.776 "name": "BaseBdev2", 00:22:46.776 "uuid": "531e584c-3910-4605-b41b-1d38d1a0104d", 00:22:46.776 "is_configured": true, 00:22:46.776 "data_offset": 0, 00:22:46.776 "data_size": 65536 00:22:46.776 }, 00:22:46.776 { 00:22:46.776 "name": "BaseBdev3", 00:22:46.776 "uuid": "57cb55b5-0a0a-4e2a-a4ed-01d7882be280", 00:22:46.776 "is_configured": true, 00:22:46.776 "data_offset": 0, 00:22:46.776 "data_size": 65536 00:22:46.776 }, 00:22:46.776 { 00:22:46.776 "name": "BaseBdev4", 00:22:46.776 "uuid": "235ecbb1-90f8-498e-94b9-e28ee537870e", 00:22:46.776 "is_configured": true, 00:22:46.776 "data_offset": 0, 00:22:46.776 "data_size": 65536 00:22:46.776 } 00:22:46.776 ] 00:22:46.776 }' 00:22:46.776 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:46.776 12:04:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:47.344 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:22:47.344 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:47.344 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.344 12:04:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:47.603 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:47.603 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:47.603 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:22:48.172 [2024-07-15 12:04:01.631364] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:48.172 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:48.172 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:48.172 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.172 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:48.431 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:48.431 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:48.431 12:04:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:22:48.691 [2024-07-15 12:04:02.149098] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:48.691 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:48.691 12:04:02 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:48.691 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.691 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:22:48.950 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:22:48.950 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:22:48.950 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:22:49.210 [2024-07-15 12:04:02.702964] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:22:49.210 [2024-07-15 12:04:02.703046] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:49.210 [2024-07-15 12:04:02.713751] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:49.210 [2024-07-15 12:04:02.713783] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:49.210 [2024-07-15 12:04:02.713794] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd564a0 name Existed_Raid, state offline 00:22:49.210 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:22:49.210 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:22:49.210 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.210 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 
00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:49.470 12:04:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:22:49.735 BaseBdev2 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:49.735 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:49.995 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:22:50.254 [ 00:22:50.254 { 00:22:50.254 "name": "BaseBdev2", 00:22:50.254 "aliases": [ 00:22:50.254 "8b67da11-a265-4306-8233-ecb64804f9f3" 
00:22:50.254 ], 00:22:50.254 "product_name": "Malloc disk", 00:22:50.254 "block_size": 512, 00:22:50.254 "num_blocks": 65536, 00:22:50.254 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:50.254 "assigned_rate_limits": { 00:22:50.254 "rw_ios_per_sec": 0, 00:22:50.254 "rw_mbytes_per_sec": 0, 00:22:50.254 "r_mbytes_per_sec": 0, 00:22:50.254 "w_mbytes_per_sec": 0 00:22:50.254 }, 00:22:50.254 "claimed": false, 00:22:50.254 "zoned": false, 00:22:50.254 "supported_io_types": { 00:22:50.254 "read": true, 00:22:50.254 "write": true, 00:22:50.254 "unmap": true, 00:22:50.254 "flush": true, 00:22:50.254 "reset": true, 00:22:50.254 "nvme_admin": false, 00:22:50.254 "nvme_io": false, 00:22:50.254 "nvme_io_md": false, 00:22:50.254 "write_zeroes": true, 00:22:50.254 "zcopy": true, 00:22:50.254 "get_zone_info": false, 00:22:50.254 "zone_management": false, 00:22:50.254 "zone_append": false, 00:22:50.254 "compare": false, 00:22:50.254 "compare_and_write": false, 00:22:50.254 "abort": true, 00:22:50.254 "seek_hole": false, 00:22:50.254 "seek_data": false, 00:22:50.254 "copy": true, 00:22:50.254 "nvme_iov_md": false 00:22:50.254 }, 00:22:50.254 "memory_domains": [ 00:22:50.254 { 00:22:50.254 "dma_device_id": "system", 00:22:50.254 "dma_device_type": 1 00:22:50.254 }, 00:22:50.254 { 00:22:50.254 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:50.254 "dma_device_type": 2 00:22:50.254 } 00:22:50.254 ], 00:22:50.254 "driver_specific": {} 00:22:50.254 } 00:22:50.254 ] 00:22:50.254 12:04:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:50.254 12:04:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:50.254 12:04:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:50.254 12:04:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev3 00:22:50.512 BaseBdev3 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:50.512 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:50.771 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:22:51.030 [ 00:22:51.030 { 00:22:51.030 "name": "BaseBdev3", 00:22:51.030 "aliases": [ 00:22:51.030 "170c969e-a8bb-46cf-bd6d-6faac286de97" 00:22:51.030 ], 00:22:51.030 "product_name": "Malloc disk", 00:22:51.030 "block_size": 512, 00:22:51.030 "num_blocks": 65536, 00:22:51.030 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:51.030 "assigned_rate_limits": { 00:22:51.030 "rw_ios_per_sec": 0, 00:22:51.030 "rw_mbytes_per_sec": 0, 00:22:51.030 "r_mbytes_per_sec": 0, 00:22:51.030 "w_mbytes_per_sec": 0 00:22:51.030 }, 00:22:51.030 "claimed": false, 00:22:51.030 "zoned": false, 00:22:51.030 "supported_io_types": { 00:22:51.030 "read": true, 00:22:51.030 "write": true, 00:22:51.030 "unmap": true, 00:22:51.030 "flush": true, 00:22:51.030 "reset": true, 00:22:51.030 "nvme_admin": false, 00:22:51.030 "nvme_io": false, 00:22:51.030 "nvme_io_md": false, 
00:22:51.030 "write_zeroes": true, 00:22:51.030 "zcopy": true, 00:22:51.030 "get_zone_info": false, 00:22:51.030 "zone_management": false, 00:22:51.030 "zone_append": false, 00:22:51.030 "compare": false, 00:22:51.030 "compare_and_write": false, 00:22:51.030 "abort": true, 00:22:51.030 "seek_hole": false, 00:22:51.030 "seek_data": false, 00:22:51.030 "copy": true, 00:22:51.030 "nvme_iov_md": false 00:22:51.030 }, 00:22:51.030 "memory_domains": [ 00:22:51.030 { 00:22:51.030 "dma_device_id": "system", 00:22:51.030 "dma_device_type": 1 00:22:51.030 }, 00:22:51.030 { 00:22:51.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.030 "dma_device_type": 2 00:22:51.030 } 00:22:51.030 ], 00:22:51.030 "driver_specific": {} 00:22:51.030 } 00:22:51.030 ] 00:22:51.030 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:51.030 12:04:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:51.030 12:04:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:51.030 12:04:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:22:51.289 BaseBdev4 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:51.289 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:51.289 12:04:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:51.548 12:04:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:22:51.808 [ 00:22:51.808 { 00:22:51.808 "name": "BaseBdev4", 00:22:51.808 "aliases": [ 00:22:51.808 "5407c2e5-a785-46f1-8ced-68f2fd675d7f" 00:22:51.808 ], 00:22:51.808 "product_name": "Malloc disk", 00:22:51.808 "block_size": 512, 00:22:51.808 "num_blocks": 65536, 00:22:51.808 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:51.808 "assigned_rate_limits": { 00:22:51.808 "rw_ios_per_sec": 0, 00:22:51.808 "rw_mbytes_per_sec": 0, 00:22:51.808 "r_mbytes_per_sec": 0, 00:22:51.808 "w_mbytes_per_sec": 0 00:22:51.808 }, 00:22:51.808 "claimed": false, 00:22:51.808 "zoned": false, 00:22:51.808 "supported_io_types": { 00:22:51.808 "read": true, 00:22:51.808 "write": true, 00:22:51.808 "unmap": true, 00:22:51.808 "flush": true, 00:22:51.808 "reset": true, 00:22:51.808 "nvme_admin": false, 00:22:51.808 "nvme_io": false, 00:22:51.808 "nvme_io_md": false, 00:22:51.808 "write_zeroes": true, 00:22:51.808 "zcopy": true, 00:22:51.808 "get_zone_info": false, 00:22:51.808 "zone_management": false, 00:22:51.808 "zone_append": false, 00:22:51.808 "compare": false, 00:22:51.808 "compare_and_write": false, 00:22:51.808 "abort": true, 00:22:51.808 "seek_hole": false, 00:22:51.808 "seek_data": false, 00:22:51.808 "copy": true, 00:22:51.808 "nvme_iov_md": false 00:22:51.808 }, 00:22:51.808 "memory_domains": [ 00:22:51.808 { 00:22:51.808 "dma_device_id": "system", 00:22:51.808 "dma_device_type": 1 00:22:51.808 }, 00:22:51.808 { 00:22:51.808 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:51.808 "dma_device_type": 2 00:22:51.808 } 00:22:51.808 ], 00:22:51.808 "driver_specific": {} 
00:22:51.808 } 00:22:51.808 ] 00:22:51.808 12:04:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:51.808 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:22:51.808 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:22:51.808 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:22:52.067 [2024-07-15 12:04:05.443517] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:22:52.067 [2024-07-15 12:04:05.443564] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:22:52.067 [2024-07-15 12:04:05.443583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:52.067 [2024-07-15 12:04:05.444951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:52.067 [2024-07-15 12:04:05.444994] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:52.067 12:04:05 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.067 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:52.326 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.326 "name": "Existed_Raid", 00:22:52.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.326 "strip_size_kb": 0, 00:22:52.326 "state": "configuring", 00:22:52.326 "raid_level": "raid1", 00:22:52.326 "superblock": false, 00:22:52.326 "num_base_bdevs": 4, 00:22:52.326 "num_base_bdevs_discovered": 3, 00:22:52.326 "num_base_bdevs_operational": 4, 00:22:52.326 "base_bdevs_list": [ 00:22:52.326 { 00:22:52.326 "name": "BaseBdev1", 00:22:52.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:52.326 "is_configured": false, 00:22:52.326 "data_offset": 0, 00:22:52.326 "data_size": 0 00:22:52.326 }, 00:22:52.326 { 00:22:52.326 "name": "BaseBdev2", 00:22:52.326 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:52.327 "is_configured": true, 00:22:52.327 "data_offset": 0, 00:22:52.327 "data_size": 65536 00:22:52.327 }, 00:22:52.327 { 00:22:52.327 "name": "BaseBdev3", 00:22:52.327 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:52.327 "is_configured": true, 00:22:52.327 "data_offset": 0, 00:22:52.327 "data_size": 65536 00:22:52.327 }, 00:22:52.327 { 00:22:52.327 "name": "BaseBdev4", 00:22:52.327 "uuid": 
"5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:52.327 "is_configured": true, 00:22:52.327 "data_offset": 0, 00:22:52.327 "data_size": 65536 00:22:52.327 } 00:22:52.327 ] 00:22:52.327 }' 00:22:52.327 12:04:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.327 12:04:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:52.893 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:22:53.153 [2024-07-15 12:04:06.622624] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.153 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:53.412 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:53.412 "name": "Existed_Raid", 00:22:53.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:53.412 "strip_size_kb": 0, 00:22:53.412 "state": "configuring", 00:22:53.412 "raid_level": "raid1", 00:22:53.412 "superblock": false, 00:22:53.412 "num_base_bdevs": 4, 00:22:53.412 "num_base_bdevs_discovered": 2, 00:22:53.412 "num_base_bdevs_operational": 4, 00:22:53.412 "base_bdevs_list": [ 00:22:53.412 { 00:22:53.412 "name": "BaseBdev1", 00:22:53.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:53.412 "is_configured": false, 00:22:53.412 "data_offset": 0, 00:22:53.412 "data_size": 0 00:22:53.412 }, 00:22:53.412 { 00:22:53.412 "name": null, 00:22:53.412 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:53.412 "is_configured": false, 00:22:53.412 "data_offset": 0, 00:22:53.412 "data_size": 65536 00:22:53.412 }, 00:22:53.412 { 00:22:53.412 "name": "BaseBdev3", 00:22:53.412 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:53.412 "is_configured": true, 00:22:53.412 "data_offset": 0, 00:22:53.412 "data_size": 65536 00:22:53.412 }, 00:22:53.412 { 00:22:53.412 "name": "BaseBdev4", 00:22:53.412 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:53.412 "is_configured": true, 00:22:53.412 "data_offset": 0, 00:22:53.412 "data_size": 65536 00:22:53.412 } 00:22:53.412 ] 00:22:53.412 }' 00:22:53.412 12:04:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:53.412 12:04:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:53.980 12:04:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.980 12:04:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:22:54.239 12:04:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:22:54.239 12:04:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:22:54.498 [2024-07-15 12:04:07.974897] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:54.498 BaseBdev1 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:22:54.498 12:04:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:22:54.757 12:04:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:22:55.017 [ 00:22:55.017 { 00:22:55.017 "name": "BaseBdev1", 00:22:55.017 "aliases": [ 00:22:55.017 "b4004eba-1133-461f-b84b-1f93a32254f3" 00:22:55.017 ], 00:22:55.017 
"product_name": "Malloc disk", 00:22:55.017 "block_size": 512, 00:22:55.017 "num_blocks": 65536, 00:22:55.017 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:22:55.017 "assigned_rate_limits": { 00:22:55.017 "rw_ios_per_sec": 0, 00:22:55.017 "rw_mbytes_per_sec": 0, 00:22:55.017 "r_mbytes_per_sec": 0, 00:22:55.017 "w_mbytes_per_sec": 0 00:22:55.017 }, 00:22:55.017 "claimed": true, 00:22:55.017 "claim_type": "exclusive_write", 00:22:55.017 "zoned": false, 00:22:55.017 "supported_io_types": { 00:22:55.017 "read": true, 00:22:55.017 "write": true, 00:22:55.017 "unmap": true, 00:22:55.017 "flush": true, 00:22:55.017 "reset": true, 00:22:55.017 "nvme_admin": false, 00:22:55.017 "nvme_io": false, 00:22:55.017 "nvme_io_md": false, 00:22:55.017 "write_zeroes": true, 00:22:55.017 "zcopy": true, 00:22:55.017 "get_zone_info": false, 00:22:55.017 "zone_management": false, 00:22:55.017 "zone_append": false, 00:22:55.017 "compare": false, 00:22:55.017 "compare_and_write": false, 00:22:55.017 "abort": true, 00:22:55.017 "seek_hole": false, 00:22:55.017 "seek_data": false, 00:22:55.017 "copy": true, 00:22:55.017 "nvme_iov_md": false 00:22:55.017 }, 00:22:55.017 "memory_domains": [ 00:22:55.017 { 00:22:55.017 "dma_device_id": "system", 00:22:55.017 "dma_device_type": 1 00:22:55.017 }, 00:22:55.017 { 00:22:55.017 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:22:55.017 "dma_device_type": 2 00:22:55.017 } 00:22:55.017 ], 00:22:55.017 "driver_specific": {} 00:22:55.017 } 00:22:55.017 ] 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:55.017 
12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.017 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:55.276 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:55.276 "name": "Existed_Raid", 00:22:55.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:55.276 "strip_size_kb": 0, 00:22:55.276 "state": "configuring", 00:22:55.276 "raid_level": "raid1", 00:22:55.276 "superblock": false, 00:22:55.276 "num_base_bdevs": 4, 00:22:55.276 "num_base_bdevs_discovered": 3, 00:22:55.276 "num_base_bdevs_operational": 4, 00:22:55.276 "base_bdevs_list": [ 00:22:55.276 { 00:22:55.276 "name": "BaseBdev1", 00:22:55.276 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:22:55.276 "is_configured": true, 00:22:55.276 "data_offset": 0, 00:22:55.276 "data_size": 65536 00:22:55.276 }, 00:22:55.276 { 00:22:55.276 "name": null, 00:22:55.276 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:55.276 "is_configured": false, 00:22:55.276 "data_offset": 0, 
00:22:55.276 "data_size": 65536 00:22:55.276 }, 00:22:55.276 { 00:22:55.276 "name": "BaseBdev3", 00:22:55.276 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:55.276 "is_configured": true, 00:22:55.276 "data_offset": 0, 00:22:55.276 "data_size": 65536 00:22:55.276 }, 00:22:55.276 { 00:22:55.276 "name": "BaseBdev4", 00:22:55.276 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:55.276 "is_configured": true, 00:22:55.276 "data_offset": 0, 00:22:55.276 "data_size": 65536 00:22:55.276 } 00:22:55.276 ] 00:22:55.276 }' 00:22:55.276 12:04:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:55.276 12:04:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:55.845 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.845 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:22:56.105 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:22:56.105 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:22:56.365 [2024-07-15 12:04:09.747645] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.365 12:04:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:56.624 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.624 "name": "Existed_Raid", 00:22:56.624 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.624 "strip_size_kb": 0, 00:22:56.624 "state": "configuring", 00:22:56.624 "raid_level": "raid1", 00:22:56.624 "superblock": false, 00:22:56.624 "num_base_bdevs": 4, 00:22:56.624 "num_base_bdevs_discovered": 2, 00:22:56.624 "num_base_bdevs_operational": 4, 00:22:56.624 "base_bdevs_list": [ 00:22:56.624 { 00:22:56.624 "name": "BaseBdev1", 00:22:56.624 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:22:56.624 "is_configured": true, 00:22:56.624 "data_offset": 0, 00:22:56.624 "data_size": 65536 00:22:56.624 }, 00:22:56.624 { 00:22:56.624 "name": null, 00:22:56.624 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:56.624 "is_configured": false, 00:22:56.624 "data_offset": 0, 00:22:56.624 "data_size": 65536 00:22:56.624 }, 00:22:56.624 { 00:22:56.624 "name": null, 00:22:56.624 
"uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:56.624 "is_configured": false, 00:22:56.624 "data_offset": 0, 00:22:56.624 "data_size": 65536 00:22:56.624 }, 00:22:56.624 { 00:22:56.624 "name": "BaseBdev4", 00:22:56.624 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:56.624 "is_configured": true, 00:22:56.624 "data_offset": 0, 00:22:56.624 "data_size": 65536 00:22:56.624 } 00:22:56.624 ] 00:22:56.624 }' 00:22:56.624 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.624 12:04:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:57.193 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.193 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:57.452 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:22:57.452 12:04:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:22:57.711 [2024-07-15 12:04:11.127499] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.711 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:57.993 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:57.993 "name": "Existed_Raid", 00:22:57.993 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:57.993 "strip_size_kb": 0, 00:22:57.993 "state": "configuring", 00:22:57.993 "raid_level": "raid1", 00:22:57.994 "superblock": false, 00:22:57.994 "num_base_bdevs": 4, 00:22:57.994 "num_base_bdevs_discovered": 3, 00:22:57.994 "num_base_bdevs_operational": 4, 00:22:57.994 "base_bdevs_list": [ 00:22:57.994 { 00:22:57.994 "name": "BaseBdev1", 00:22:57.994 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:22:57.994 "is_configured": true, 00:22:57.994 "data_offset": 0, 00:22:57.994 "data_size": 65536 00:22:57.994 }, 00:22:57.994 { 00:22:57.994 "name": null, 00:22:57.994 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:57.994 "is_configured": false, 00:22:57.994 "data_offset": 0, 00:22:57.994 "data_size": 65536 00:22:57.994 }, 00:22:57.994 { 00:22:57.994 "name": "BaseBdev3", 00:22:57.994 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:57.994 "is_configured": true, 
00:22:57.994 "data_offset": 0, 00:22:57.994 "data_size": 65536 00:22:57.994 }, 00:22:57.994 { 00:22:57.994 "name": "BaseBdev4", 00:22:57.994 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:57.994 "is_configured": true, 00:22:57.994 "data_offset": 0, 00:22:57.994 "data_size": 65536 00:22:57.994 } 00:22:57.994 ] 00:22:57.994 }' 00:22:57.994 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:57.994 12:04:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:58.562 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:22:58.562 12:04:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.833 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:22:58.833 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:22:59.132 [2024-07-15 12:04:12.491138] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.133 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:22:59.408 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.408 "name": "Existed_Raid", 00:22:59.408 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.408 "strip_size_kb": 0, 00:22:59.408 "state": "configuring", 00:22:59.408 "raid_level": "raid1", 00:22:59.408 "superblock": false, 00:22:59.408 "num_base_bdevs": 4, 00:22:59.408 "num_base_bdevs_discovered": 2, 00:22:59.408 "num_base_bdevs_operational": 4, 00:22:59.408 "base_bdevs_list": [ 00:22:59.408 { 00:22:59.408 "name": null, 00:22:59.408 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:22:59.408 "is_configured": false, 00:22:59.408 "data_offset": 0, 00:22:59.408 "data_size": 65536 00:22:59.408 }, 00:22:59.408 { 00:22:59.408 "name": null, 00:22:59.408 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:22:59.408 "is_configured": false, 00:22:59.408 "data_offset": 0, 00:22:59.408 "data_size": 65536 00:22:59.408 }, 00:22:59.408 { 00:22:59.408 "name": "BaseBdev3", 00:22:59.408 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:22:59.408 "is_configured": true, 00:22:59.408 "data_offset": 0, 00:22:59.408 "data_size": 65536 00:22:59.408 }, 00:22:59.408 { 00:22:59.408 "name": 
"BaseBdev4", 00:22:59.408 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:22:59.408 "is_configured": true, 00:22:59.408 "data_offset": 0, 00:22:59.408 "data_size": 65536 00:22:59.408 } 00:22:59.408 ] 00:22:59.408 }' 00:22:59.408 12:04:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.408 12:04:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:22:59.977 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.977 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:00.237 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:00.237 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:00.497 [2024-07-15 12:04:13.872121] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:00.497 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:00.497 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:00.497 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:00.497 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:00.497 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:00.498 12:04:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:00.758 12:04:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:00.758 "name": "Existed_Raid", 00:23:00.758 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:00.758 "strip_size_kb": 0, 00:23:00.758 "state": "configuring", 00:23:00.758 "raid_level": "raid1", 00:23:00.758 "superblock": false, 00:23:00.758 "num_base_bdevs": 4, 00:23:00.758 "num_base_bdevs_discovered": 3, 00:23:00.758 "num_base_bdevs_operational": 4, 00:23:00.758 "base_bdevs_list": [ 00:23:00.758 { 00:23:00.758 "name": null, 00:23:00.758 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:23:00.758 "is_configured": false, 00:23:00.758 "data_offset": 0, 00:23:00.758 "data_size": 65536 00:23:00.758 }, 00:23:00.758 { 00:23:00.758 "name": "BaseBdev2", 00:23:00.758 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:23:00.758 "is_configured": true, 00:23:00.758 "data_offset": 0, 00:23:00.758 "data_size": 65536 00:23:00.758 }, 00:23:00.758 { 00:23:00.758 "name": "BaseBdev3", 00:23:00.758 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:23:00.758 "is_configured": true, 00:23:00.758 "data_offset": 0, 00:23:00.758 "data_size": 65536 00:23:00.758 }, 00:23:00.758 { 00:23:00.758 "name": "BaseBdev4", 00:23:00.758 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:23:00.758 
"is_configured": true, 00:23:00.758 "data_offset": 0, 00:23:00.758 "data_size": 65536 00:23:00.758 } 00:23:00.758 ] 00:23:00.758 }' 00:23:00.758 12:04:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:00.758 12:04:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:01.329 12:04:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.329 12:04:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:01.590 12:04:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:01.590 12:04:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.590 12:04:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:01.850 12:04:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b4004eba-1133-461f-b84b-1f93a32254f3 00:23:02.109 [2024-07-15 12:04:15.561472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:02.109 [2024-07-15 12:04:15.561515] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd57120 00:23:02.109 [2024-07-15 12:04:15.561524] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:02.109 [2024-07-15 12:04:15.561760] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd55af0 00:23:02.109 [2024-07-15 12:04:15.561905] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd57120 00:23:02.109 [2024-07-15 
12:04:15.561915] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd57120 00:23:02.109 [2024-07-15 12:04:15.562094] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:02.109 NewBaseBdev 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local i 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:02.109 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:02.370 12:04:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:02.630 [ 00:23:02.630 { 00:23:02.630 "name": "NewBaseBdev", 00:23:02.630 "aliases": [ 00:23:02.630 "b4004eba-1133-461f-b84b-1f93a32254f3" 00:23:02.630 ], 00:23:02.630 "product_name": "Malloc disk", 00:23:02.630 "block_size": 512, 00:23:02.630 "num_blocks": 65536, 00:23:02.630 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:23:02.630 "assigned_rate_limits": { 00:23:02.630 "rw_ios_per_sec": 0, 00:23:02.630 "rw_mbytes_per_sec": 0, 00:23:02.630 "r_mbytes_per_sec": 0, 00:23:02.630 "w_mbytes_per_sec": 0 00:23:02.630 }, 00:23:02.630 "claimed": true, 00:23:02.630 "claim_type": "exclusive_write", 00:23:02.630 "zoned": 
false, 00:23:02.630 "supported_io_types": { 00:23:02.630 "read": true, 00:23:02.630 "write": true, 00:23:02.630 "unmap": true, 00:23:02.630 "flush": true, 00:23:02.630 "reset": true, 00:23:02.630 "nvme_admin": false, 00:23:02.630 "nvme_io": false, 00:23:02.630 "nvme_io_md": false, 00:23:02.630 "write_zeroes": true, 00:23:02.630 "zcopy": true, 00:23:02.630 "get_zone_info": false, 00:23:02.630 "zone_management": false, 00:23:02.630 "zone_append": false, 00:23:02.630 "compare": false, 00:23:02.630 "compare_and_write": false, 00:23:02.630 "abort": true, 00:23:02.630 "seek_hole": false, 00:23:02.630 "seek_data": false, 00:23:02.630 "copy": true, 00:23:02.630 "nvme_iov_md": false 00:23:02.630 }, 00:23:02.630 "memory_domains": [ 00:23:02.630 { 00:23:02.630 "dma_device_id": "system", 00:23:02.630 "dma_device_type": 1 00:23:02.630 }, 00:23:02.630 { 00:23:02.630 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:02.630 "dma_device_type": 2 00:23:02.630 } 00:23:02.630 ], 00:23:02.630 "driver_specific": {} 00:23:02.630 } 00:23:02.630 ] 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # return 0 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.630 12:04:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.630 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:02.890 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.890 "name": "Existed_Raid", 00:23:02.890 "uuid": "4e83514e-2305-469a-984f-1e26ccc2bea8", 00:23:02.890 "strip_size_kb": 0, 00:23:02.890 "state": "online", 00:23:02.890 "raid_level": "raid1", 00:23:02.890 "superblock": false, 00:23:02.890 "num_base_bdevs": 4, 00:23:02.890 "num_base_bdevs_discovered": 4, 00:23:02.890 "num_base_bdevs_operational": 4, 00:23:02.890 "base_bdevs_list": [ 00:23:02.890 { 00:23:02.890 "name": "NewBaseBdev", 00:23:02.890 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:23:02.890 "is_configured": true, 00:23:02.890 "data_offset": 0, 00:23:02.890 "data_size": 65536 00:23:02.890 }, 00:23:02.890 { 00:23:02.890 "name": "BaseBdev2", 00:23:02.890 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:23:02.890 "is_configured": true, 00:23:02.890 "data_offset": 0, 00:23:02.890 "data_size": 65536 00:23:02.890 }, 00:23:02.890 { 00:23:02.890 "name": "BaseBdev3", 00:23:02.890 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:23:02.890 "is_configured": true, 00:23:02.890 "data_offset": 0, 00:23:02.890 "data_size": 65536 00:23:02.890 }, 00:23:02.890 { 00:23:02.890 "name": "BaseBdev4", 00:23:02.890 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:23:02.891 "is_configured": true, 00:23:02.891 "data_offset": 0, 00:23:02.891 
"data_size": 65536 00:23:02.891 } 00:23:02.891 ] 00:23:02.891 }' 00:23:02.891 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.891 12:04:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:03.458 12:04:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:03.718 [2024-07-15 12:04:17.145994] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:03.718 "name": "Existed_Raid", 00:23:03.718 "aliases": [ 00:23:03.718 "4e83514e-2305-469a-984f-1e26ccc2bea8" 00:23:03.718 ], 00:23:03.718 "product_name": "Raid Volume", 00:23:03.718 "block_size": 512, 00:23:03.718 "num_blocks": 65536, 00:23:03.718 "uuid": "4e83514e-2305-469a-984f-1e26ccc2bea8", 00:23:03.718 "assigned_rate_limits": { 00:23:03.718 "rw_ios_per_sec": 0, 00:23:03.718 "rw_mbytes_per_sec": 0, 00:23:03.718 "r_mbytes_per_sec": 0, 00:23:03.718 "w_mbytes_per_sec": 0 00:23:03.718 }, 00:23:03.718 "claimed": false, 
00:23:03.718 "zoned": false, 00:23:03.718 "supported_io_types": { 00:23:03.718 "read": true, 00:23:03.718 "write": true, 00:23:03.718 "unmap": false, 00:23:03.718 "flush": false, 00:23:03.718 "reset": true, 00:23:03.718 "nvme_admin": false, 00:23:03.718 "nvme_io": false, 00:23:03.718 "nvme_io_md": false, 00:23:03.718 "write_zeroes": true, 00:23:03.718 "zcopy": false, 00:23:03.718 "get_zone_info": false, 00:23:03.718 "zone_management": false, 00:23:03.718 "zone_append": false, 00:23:03.718 "compare": false, 00:23:03.718 "compare_and_write": false, 00:23:03.718 "abort": false, 00:23:03.718 "seek_hole": false, 00:23:03.718 "seek_data": false, 00:23:03.718 "copy": false, 00:23:03.718 "nvme_iov_md": false 00:23:03.718 }, 00:23:03.718 "memory_domains": [ 00:23:03.718 { 00:23:03.718 "dma_device_id": "system", 00:23:03.718 "dma_device_type": 1 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.718 "dma_device_type": 2 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "system", 00:23:03.718 "dma_device_type": 1 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.718 "dma_device_type": 2 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "system", 00:23:03.718 "dma_device_type": 1 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.718 "dma_device_type": 2 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "system", 00:23:03.718 "dma_device_type": 1 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.718 "dma_device_type": 2 00:23:03.718 } 00:23:03.718 ], 00:23:03.718 "driver_specific": { 00:23:03.718 "raid": { 00:23:03.718 "uuid": "4e83514e-2305-469a-984f-1e26ccc2bea8", 00:23:03.718 "strip_size_kb": 0, 00:23:03.718 "state": "online", 00:23:03.718 "raid_level": "raid1", 00:23:03.718 "superblock": false, 00:23:03.718 "num_base_bdevs": 4, 00:23:03.718 
"num_base_bdevs_discovered": 4, 00:23:03.718 "num_base_bdevs_operational": 4, 00:23:03.718 "base_bdevs_list": [ 00:23:03.718 { 00:23:03.718 "name": "NewBaseBdev", 00:23:03.718 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:23:03.718 "is_configured": true, 00:23:03.718 "data_offset": 0, 00:23:03.718 "data_size": 65536 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "name": "BaseBdev2", 00:23:03.718 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:23:03.718 "is_configured": true, 00:23:03.718 "data_offset": 0, 00:23:03.718 "data_size": 65536 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "name": "BaseBdev3", 00:23:03.718 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:23:03.718 "is_configured": true, 00:23:03.718 "data_offset": 0, 00:23:03.718 "data_size": 65536 00:23:03.718 }, 00:23:03.718 { 00:23:03.718 "name": "BaseBdev4", 00:23:03.718 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:23:03.718 "is_configured": true, 00:23:03.718 "data_offset": 0, 00:23:03.718 "data_size": 65536 00:23:03.718 } 00:23:03.718 ] 00:23:03.718 } 00:23:03.718 } 00:23:03.718 }' 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:03.718 BaseBdev2 00:23:03.718 BaseBdev3 00:23:03.718 BaseBdev4' 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:03.718 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:03.978 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:03.978 "name": "NewBaseBdev", 
00:23:03.978 "aliases": [ 00:23:03.978 "b4004eba-1133-461f-b84b-1f93a32254f3" 00:23:03.978 ], 00:23:03.978 "product_name": "Malloc disk", 00:23:03.978 "block_size": 512, 00:23:03.978 "num_blocks": 65536, 00:23:03.978 "uuid": "b4004eba-1133-461f-b84b-1f93a32254f3", 00:23:03.978 "assigned_rate_limits": { 00:23:03.978 "rw_ios_per_sec": 0, 00:23:03.978 "rw_mbytes_per_sec": 0, 00:23:03.978 "r_mbytes_per_sec": 0, 00:23:03.978 "w_mbytes_per_sec": 0 00:23:03.978 }, 00:23:03.978 "claimed": true, 00:23:03.978 "claim_type": "exclusive_write", 00:23:03.978 "zoned": false, 00:23:03.978 "supported_io_types": { 00:23:03.978 "read": true, 00:23:03.978 "write": true, 00:23:03.978 "unmap": true, 00:23:03.978 "flush": true, 00:23:03.978 "reset": true, 00:23:03.978 "nvme_admin": false, 00:23:03.978 "nvme_io": false, 00:23:03.978 "nvme_io_md": false, 00:23:03.978 "write_zeroes": true, 00:23:03.978 "zcopy": true, 00:23:03.978 "get_zone_info": false, 00:23:03.978 "zone_management": false, 00:23:03.978 "zone_append": false, 00:23:03.978 "compare": false, 00:23:03.978 "compare_and_write": false, 00:23:03.978 "abort": true, 00:23:03.978 "seek_hole": false, 00:23:03.978 "seek_data": false, 00:23:03.978 "copy": true, 00:23:03.978 "nvme_iov_md": false 00:23:03.978 }, 00:23:03.978 "memory_domains": [ 00:23:03.978 { 00:23:03.978 "dma_device_id": "system", 00:23:03.978 "dma_device_type": 1 00:23:03.978 }, 00:23:03.978 { 00:23:03.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:03.978 "dma_device_type": 2 00:23:03.978 } 00:23:03.978 ], 00:23:03.978 "driver_specific": {} 00:23:03.978 }' 00:23:03.978 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:03.978 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.237 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:04.497 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:04.497 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:04.497 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:04.497 12:04:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:04.756 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:04.756 "name": "BaseBdev2", 00:23:04.756 "aliases": [ 00:23:04.756 "8b67da11-a265-4306-8233-ecb64804f9f3" 00:23:04.756 ], 00:23:04.756 "product_name": "Malloc disk", 00:23:04.756 "block_size": 512, 00:23:04.756 "num_blocks": 65536, 00:23:04.756 "uuid": "8b67da11-a265-4306-8233-ecb64804f9f3", 00:23:04.756 "assigned_rate_limits": { 00:23:04.756 "rw_ios_per_sec": 0, 00:23:04.756 "rw_mbytes_per_sec": 0, 00:23:04.756 "r_mbytes_per_sec": 0, 00:23:04.756 "w_mbytes_per_sec": 0 00:23:04.756 }, 00:23:04.756 "claimed": true, 00:23:04.756 "claim_type": "exclusive_write", 00:23:04.756 "zoned": false, 00:23:04.756 "supported_io_types": { 00:23:04.756 
"read": true, 00:23:04.756 "write": true, 00:23:04.756 "unmap": true, 00:23:04.756 "flush": true, 00:23:04.756 "reset": true, 00:23:04.756 "nvme_admin": false, 00:23:04.756 "nvme_io": false, 00:23:04.756 "nvme_io_md": false, 00:23:04.756 "write_zeroes": true, 00:23:04.756 "zcopy": true, 00:23:04.756 "get_zone_info": false, 00:23:04.756 "zone_management": false, 00:23:04.756 "zone_append": false, 00:23:04.756 "compare": false, 00:23:04.757 "compare_and_write": false, 00:23:04.757 "abort": true, 00:23:04.757 "seek_hole": false, 00:23:04.757 "seek_data": false, 00:23:04.757 "copy": true, 00:23:04.757 "nvme_iov_md": false 00:23:04.757 }, 00:23:04.757 "memory_domains": [ 00:23:04.757 { 00:23:04.757 "dma_device_id": "system", 00:23:04.757 "dma_device_type": 1 00:23:04.757 }, 00:23:04.757 { 00:23:04.757 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:04.757 "dma_device_type": 2 00:23:04.757 } 00:23:04.757 ], 00:23:04.757 "driver_specific": {} 00:23:04.757 }' 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:04.757 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.016 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:05.275 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.275 "name": "BaseBdev3", 00:23:05.275 "aliases": [ 00:23:05.275 "170c969e-a8bb-46cf-bd6d-6faac286de97" 00:23:05.275 ], 00:23:05.275 "product_name": "Malloc disk", 00:23:05.275 "block_size": 512, 00:23:05.275 "num_blocks": 65536, 00:23:05.275 "uuid": "170c969e-a8bb-46cf-bd6d-6faac286de97", 00:23:05.275 "assigned_rate_limits": { 00:23:05.275 "rw_ios_per_sec": 0, 00:23:05.275 "rw_mbytes_per_sec": 0, 00:23:05.275 "r_mbytes_per_sec": 0, 00:23:05.275 "w_mbytes_per_sec": 0 00:23:05.275 }, 00:23:05.275 "claimed": true, 00:23:05.275 "claim_type": "exclusive_write", 00:23:05.275 "zoned": false, 00:23:05.275 "supported_io_types": { 00:23:05.275 "read": true, 00:23:05.275 "write": true, 00:23:05.275 "unmap": true, 00:23:05.275 "flush": true, 00:23:05.275 "reset": true, 00:23:05.275 "nvme_admin": false, 00:23:05.275 "nvme_io": false, 00:23:05.275 "nvme_io_md": false, 00:23:05.275 "write_zeroes": true, 00:23:05.275 "zcopy": true, 00:23:05.275 "get_zone_info": false, 00:23:05.275 "zone_management": false, 00:23:05.275 "zone_append": false, 00:23:05.275 "compare": false, 00:23:05.275 "compare_and_write": false, 00:23:05.275 "abort": true, 00:23:05.275 "seek_hole": false, 00:23:05.275 "seek_data": false, 00:23:05.275 "copy": true, 00:23:05.275 "nvme_iov_md": 
false 00:23:05.275 }, 00:23:05.275 "memory_domains": [ 00:23:05.275 { 00:23:05.275 "dma_device_id": "system", 00:23:05.275 "dma_device_type": 1 00:23:05.275 }, 00:23:05.275 { 00:23:05.275 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.275 "dma_device_type": 2 00:23:05.275 } 00:23:05.275 ], 00:23:05.276 "driver_specific": {} 00:23:05.276 }' 00:23:05.276 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.276 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.276 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:05.276 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.276 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:05.535 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:05.535 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.535 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:05.535 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:05.535 12:04:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.535 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:05.535 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:05.535 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:05.535 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:05.535 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 
00:23:05.794 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:05.794 "name": "BaseBdev4", 00:23:05.794 "aliases": [ 00:23:05.794 "5407c2e5-a785-46f1-8ced-68f2fd675d7f" 00:23:05.794 ], 00:23:05.794 "product_name": "Malloc disk", 00:23:05.794 "block_size": 512, 00:23:05.794 "num_blocks": 65536, 00:23:05.794 "uuid": "5407c2e5-a785-46f1-8ced-68f2fd675d7f", 00:23:05.794 "assigned_rate_limits": { 00:23:05.794 "rw_ios_per_sec": 0, 00:23:05.794 "rw_mbytes_per_sec": 0, 00:23:05.794 "r_mbytes_per_sec": 0, 00:23:05.794 "w_mbytes_per_sec": 0 00:23:05.794 }, 00:23:05.794 "claimed": true, 00:23:05.794 "claim_type": "exclusive_write", 00:23:05.794 "zoned": false, 00:23:05.794 "supported_io_types": { 00:23:05.794 "read": true, 00:23:05.794 "write": true, 00:23:05.794 "unmap": true, 00:23:05.794 "flush": true, 00:23:05.794 "reset": true, 00:23:05.794 "nvme_admin": false, 00:23:05.794 "nvme_io": false, 00:23:05.794 "nvme_io_md": false, 00:23:05.794 "write_zeroes": true, 00:23:05.794 "zcopy": true, 00:23:05.794 "get_zone_info": false, 00:23:05.794 "zone_management": false, 00:23:05.794 "zone_append": false, 00:23:05.794 "compare": false, 00:23:05.794 "compare_and_write": false, 00:23:05.794 "abort": true, 00:23:05.794 "seek_hole": false, 00:23:05.794 "seek_data": false, 00:23:05.794 "copy": true, 00:23:05.794 "nvme_iov_md": false 00:23:05.794 }, 00:23:05.794 "memory_domains": [ 00:23:05.794 { 00:23:05.794 "dma_device_id": "system", 00:23:05.794 "dma_device_type": 1 00:23:05.794 }, 00:23:05.794 { 00:23:05.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:05.794 "dma_device_type": 2 00:23:05.794 } 00:23:05.794 ], 00:23:05.794 "driver_specific": {} 00:23:05.794 }' 00:23:05.794 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.794 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:05.794 12:04:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:05.794 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:06.053 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:06.314 [2024-07-15 12:04:19.868910] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:06.314 [2024-07-15 12:04:19.868936] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:06.314 [2024-07-15 12:04:19.868990] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:06.314 [2024-07-15 12:04:19.869292] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:06.314 [2024-07-15 12:04:19.869305] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd57120 name Existed_Raid, state offline 00:23:06.314 12:04:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 1547270 00:23:06.314 12:04:19 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@948 -- # '[' -z 1547270 ']' 00:23:06.314 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@952 -- # kill -0 1547270 00:23:06.314 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # uname 00:23:06.314 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:06.314 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1547270 00:23:06.574 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:06.574 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:06.574 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1547270' 00:23:06.574 killing process with pid 1547270 00:23:06.574 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # kill 1547270 00:23:06.574 [2024-07-15 12:04:19.936383] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:06.574 12:04:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@972 -- # wait 1547270 00:23:06.574 [2024-07-15 12:04:20.024596] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.833 12:04:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:23:06.834 00:23:06.834 real 0m34.034s 00:23:06.834 user 1m2.333s 00:23:06.834 sys 0m6.046s 00:23:06.834 12:04:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:06.834 12:04:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:23:06.834 ************************************ 00:23:06.834 END TEST raid_state_function_test 00:23:06.834 ************************************ 00:23:07.094 12:04:20 bdev_raid -- 
common/autotest_common.sh@1142 -- # return 0 00:23:07.094 12:04:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:23:07.094 12:04:20 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:23:07.094 12:04:20 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:07.094 12:04:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:07.094 ************************************ 00:23:07.094 START TEST raid_state_function_test_sb 00:23:07.094 ************************************ 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 4 true 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # 
(( i <= num_base_bdevs )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=1552311 00:23:07.094 12:04:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1552311' 00:23:07.094 Process raid pid: 1552311 00:23:07.094 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 1552311 /var/tmp/spdk-raid.sock 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1552311 ']' 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:07.095 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:07.095 12:04:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:07.095 [2024-07-15 12:04:20.559772] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:23:07.095 [2024-07-15 12:04:20.559837] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:23:07.095 [2024-07-15 12:04:20.689527] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:07.355 [2024-07-15 12:04:20.786277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:07.355 [2024-07-15 12:04:20.850696] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.355 [2024-07-15 12:04:20.850725] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.924 12:04:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:07.924 12:04:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@862 -- # return 0 00:23:07.924 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:08.183 [2024-07-15 12:04:21.729610] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:08.183 [2024-07-15 12:04:21.729658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:08.183 [2024-07-15 12:04:21.729668] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:08.183 [2024-07-15 12:04:21.729680] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:08.183 [2024-07-15 12:04:21.729700] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:08.183 [2024-07-15 12:04:21.729718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:23:08.183 [2024-07-15 12:04:21.729731] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:08.183 [2024-07-15 12:04:21.729742] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.183 12:04:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:08.442 12:04:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.442 "name": "Existed_Raid", 00:23:08.442 "uuid": "f02cf96b-0ce0-44c8-858d-fb846a0e45f4", 
00:23:08.442 "strip_size_kb": 0, 00:23:08.442 "state": "configuring", 00:23:08.442 "raid_level": "raid1", 00:23:08.442 "superblock": true, 00:23:08.442 "num_base_bdevs": 4, 00:23:08.442 "num_base_bdevs_discovered": 0, 00:23:08.442 "num_base_bdevs_operational": 4, 00:23:08.442 "base_bdevs_list": [ 00:23:08.442 { 00:23:08.442 "name": "BaseBdev1", 00:23:08.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.442 "is_configured": false, 00:23:08.442 "data_offset": 0, 00:23:08.442 "data_size": 0 00:23:08.442 }, 00:23:08.442 { 00:23:08.442 "name": "BaseBdev2", 00:23:08.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.442 "is_configured": false, 00:23:08.442 "data_offset": 0, 00:23:08.442 "data_size": 0 00:23:08.442 }, 00:23:08.442 { 00:23:08.442 "name": "BaseBdev3", 00:23:08.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.442 "is_configured": false, 00:23:08.442 "data_offset": 0, 00:23:08.442 "data_size": 0 00:23:08.442 }, 00:23:08.442 { 00:23:08.442 "name": "BaseBdev4", 00:23:08.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.442 "is_configured": false, 00:23:08.442 "data_offset": 0, 00:23:08.442 "data_size": 0 00:23:08.442 } 00:23:08.442 ] 00:23:08.442 }' 00:23:08.442 12:04:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.442 12:04:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:09.010 12:04:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:09.269 [2024-07-15 12:04:22.740136] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:09.269 [2024-07-15 12:04:22.740170] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c3ab20 name Existed_Raid, state configuring 00:23:09.269 12:04:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:09.529 [2024-07-15 12:04:22.988809] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:09.529 [2024-07-15 12:04:22.988839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:09.529 [2024-07-15 12:04:22.988849] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:09.529 [2024-07-15 12:04:22.988860] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:09.529 [2024-07-15 12:04:22.988869] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:09.529 [2024-07-15 12:04:22.988880] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:09.529 [2024-07-15 12:04:22.988888] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:09.529 [2024-07-15 12:04:22.988899] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:09.529 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:09.788 [2024-07-15 12:04:23.183224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:09.788 BaseBdev1 00:23:09.788 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:23:09.788 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:09.788 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:09.788 12:04:23 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:09.789 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:09.789 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:09.789 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:10.047 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:10.307 [ 00:23:10.307 { 00:23:10.307 "name": "BaseBdev1", 00:23:10.307 "aliases": [ 00:23:10.307 "306f193a-b110-46d9-82ee-09b8f01b7381" 00:23:10.307 ], 00:23:10.307 "product_name": "Malloc disk", 00:23:10.307 "block_size": 512, 00:23:10.307 "num_blocks": 65536, 00:23:10.307 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:10.307 "assigned_rate_limits": { 00:23:10.307 "rw_ios_per_sec": 0, 00:23:10.307 "rw_mbytes_per_sec": 0, 00:23:10.307 "r_mbytes_per_sec": 0, 00:23:10.307 "w_mbytes_per_sec": 0 00:23:10.307 }, 00:23:10.307 "claimed": true, 00:23:10.307 "claim_type": "exclusive_write", 00:23:10.307 "zoned": false, 00:23:10.307 "supported_io_types": { 00:23:10.307 "read": true, 00:23:10.307 "write": true, 00:23:10.307 "unmap": true, 00:23:10.307 "flush": true, 00:23:10.307 "reset": true, 00:23:10.307 "nvme_admin": false, 00:23:10.307 "nvme_io": false, 00:23:10.307 "nvme_io_md": false, 00:23:10.307 "write_zeroes": true, 00:23:10.307 "zcopy": true, 00:23:10.307 "get_zone_info": false, 00:23:10.307 "zone_management": false, 00:23:10.307 "zone_append": false, 00:23:10.307 "compare": false, 00:23:10.307 "compare_and_write": false, 00:23:10.307 "abort": true, 00:23:10.307 "seek_hole": false, 00:23:10.307 "seek_data": false, 
00:23:10.307 "copy": true, 00:23:10.307 "nvme_iov_md": false 00:23:10.307 }, 00:23:10.307 "memory_domains": [ 00:23:10.307 { 00:23:10.307 "dma_device_id": "system", 00:23:10.307 "dma_device_type": 1 00:23:10.307 }, 00:23:10.307 { 00:23:10.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:10.307 "dma_device_type": 2 00:23:10.307 } 00:23:10.307 ], 00:23:10.307 "driver_specific": {} 00:23:10.307 } 00:23:10.307 ] 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.307 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:23:10.566 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:10.566 "name": "Existed_Raid", 00:23:10.566 "uuid": "72895ee9-c8d4-4aed-a682-66ec6368aecb", 00:23:10.566 "strip_size_kb": 0, 00:23:10.566 "state": "configuring", 00:23:10.566 "raid_level": "raid1", 00:23:10.566 "superblock": true, 00:23:10.566 "num_base_bdevs": 4, 00:23:10.566 "num_base_bdevs_discovered": 1, 00:23:10.566 "num_base_bdevs_operational": 4, 00:23:10.566 "base_bdevs_list": [ 00:23:10.566 { 00:23:10.566 "name": "BaseBdev1", 00:23:10.566 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:10.566 "is_configured": true, 00:23:10.566 "data_offset": 2048, 00:23:10.566 "data_size": 63488 00:23:10.567 }, 00:23:10.567 { 00:23:10.567 "name": "BaseBdev2", 00:23:10.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.567 "is_configured": false, 00:23:10.567 "data_offset": 0, 00:23:10.567 "data_size": 0 00:23:10.567 }, 00:23:10.567 { 00:23:10.567 "name": "BaseBdev3", 00:23:10.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.567 "is_configured": false, 00:23:10.567 "data_offset": 0, 00:23:10.567 "data_size": 0 00:23:10.567 }, 00:23:10.567 { 00:23:10.567 "name": "BaseBdev4", 00:23:10.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.567 "is_configured": false, 00:23:10.567 "data_offset": 0, 00:23:10.567 "data_size": 0 00:23:10.567 } 00:23:10.567 ] 00:23:10.567 }' 00:23:10.567 12:04:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:10.567 12:04:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:11.134 12:04:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:11.394 [2024-07-15 12:04:24.771422] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:23:11.394 [2024-07-15 12:04:24.771461] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c3a390 name Existed_Raid, state configuring 00:23:11.394 12:04:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:11.653 [2024-07-15 12:04:25.020117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:11.653 [2024-07-15 12:04:25.021555] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:23:11.653 [2024-07-15 12:04:25.021586] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:23:11.653 [2024-07-15 12:04:25.021596] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:23:11.653 [2024-07-15 12:04:25.021608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:23:11.653 [2024-07-15 12:04:25.021616] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:23:11.653 [2024-07-15 12:04:25.021627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:11.653 12:04:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.653 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:11.912 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.912 "name": "Existed_Raid", 00:23:11.912 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:11.912 "strip_size_kb": 0, 00:23:11.912 "state": "configuring", 00:23:11.912 "raid_level": "raid1", 00:23:11.912 "superblock": true, 00:23:11.912 "num_base_bdevs": 4, 00:23:11.912 "num_base_bdevs_discovered": 1, 00:23:11.912 "num_base_bdevs_operational": 4, 00:23:11.912 "base_bdevs_list": [ 00:23:11.912 { 00:23:11.912 "name": "BaseBdev1", 00:23:11.912 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:11.912 "is_configured": true, 00:23:11.912 "data_offset": 2048, 00:23:11.912 "data_size": 63488 00:23:11.912 }, 00:23:11.912 { 00:23:11.912 "name": "BaseBdev2", 00:23:11.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.912 "is_configured": false, 
00:23:11.912 "data_offset": 0, 00:23:11.912 "data_size": 0 00:23:11.912 }, 00:23:11.912 { 00:23:11.912 "name": "BaseBdev3", 00:23:11.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.912 "is_configured": false, 00:23:11.912 "data_offset": 0, 00:23:11.912 "data_size": 0 00:23:11.912 }, 00:23:11.912 { 00:23:11.912 "name": "BaseBdev4", 00:23:11.912 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.912 "is_configured": false, 00:23:11.913 "data_offset": 0, 00:23:11.913 "data_size": 0 00:23:11.913 } 00:23:11.913 ] 00:23:11.913 }' 00:23:11.913 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.913 12:04:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:12.481 12:04:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:12.740 [2024-07-15 12:04:26.118370] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:12.740 BaseBdev2 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:12.740 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:23:12.999 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:13.258 [ 00:23:13.258 { 00:23:13.258 "name": "BaseBdev2", 00:23:13.258 "aliases": [ 00:23:13.258 "af8e513f-6d39-4b5f-891f-b61ee005efe2" 00:23:13.258 ], 00:23:13.258 "product_name": "Malloc disk", 00:23:13.258 "block_size": 512, 00:23:13.258 "num_blocks": 65536, 00:23:13.258 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:13.258 "assigned_rate_limits": { 00:23:13.258 "rw_ios_per_sec": 0, 00:23:13.258 "rw_mbytes_per_sec": 0, 00:23:13.258 "r_mbytes_per_sec": 0, 00:23:13.258 "w_mbytes_per_sec": 0 00:23:13.258 }, 00:23:13.258 "claimed": true, 00:23:13.258 "claim_type": "exclusive_write", 00:23:13.258 "zoned": false, 00:23:13.258 "supported_io_types": { 00:23:13.258 "read": true, 00:23:13.258 "write": true, 00:23:13.258 "unmap": true, 00:23:13.258 "flush": true, 00:23:13.258 "reset": true, 00:23:13.258 "nvme_admin": false, 00:23:13.258 "nvme_io": false, 00:23:13.258 "nvme_io_md": false, 00:23:13.258 "write_zeroes": true, 00:23:13.258 "zcopy": true, 00:23:13.258 "get_zone_info": false, 00:23:13.258 "zone_management": false, 00:23:13.258 "zone_append": false, 00:23:13.258 "compare": false, 00:23:13.258 "compare_and_write": false, 00:23:13.258 "abort": true, 00:23:13.258 "seek_hole": false, 00:23:13.258 "seek_data": false, 00:23:13.258 "copy": true, 00:23:13.258 "nvme_iov_md": false 00:23:13.258 }, 00:23:13.258 "memory_domains": [ 00:23:13.258 { 00:23:13.258 "dma_device_id": "system", 00:23:13.258 "dma_device_type": 1 00:23:13.258 }, 00:23:13.258 { 00:23:13.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:13.258 "dma_device_type": 2 00:23:13.258 } 00:23:13.258 ], 00:23:13.258 "driver_specific": {} 00:23:13.258 } 00:23:13.258 ] 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 
-- # return 0 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.258 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:13.518 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.518 "name": "Existed_Raid", 00:23:13.518 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:13.518 "strip_size_kb": 0, 
00:23:13.518 "state": "configuring", 00:23:13.518 "raid_level": "raid1", 00:23:13.518 "superblock": true, 00:23:13.518 "num_base_bdevs": 4, 00:23:13.518 "num_base_bdevs_discovered": 2, 00:23:13.518 "num_base_bdevs_operational": 4, 00:23:13.518 "base_bdevs_list": [ 00:23:13.518 { 00:23:13.518 "name": "BaseBdev1", 00:23:13.518 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:13.518 "is_configured": true, 00:23:13.518 "data_offset": 2048, 00:23:13.518 "data_size": 63488 00:23:13.518 }, 00:23:13.518 { 00:23:13.518 "name": "BaseBdev2", 00:23:13.518 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:13.518 "is_configured": true, 00:23:13.518 "data_offset": 2048, 00:23:13.518 "data_size": 63488 00:23:13.518 }, 00:23:13.518 { 00:23:13.518 "name": "BaseBdev3", 00:23:13.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.518 "is_configured": false, 00:23:13.518 "data_offset": 0, 00:23:13.518 "data_size": 0 00:23:13.518 }, 00:23:13.518 { 00:23:13.518 "name": "BaseBdev4", 00:23:13.518 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.518 "is_configured": false, 00:23:13.518 "data_offset": 0, 00:23:13.518 "data_size": 0 00:23:13.518 } 00:23:13.518 ] 00:23:13.518 }' 00:23:13.518 12:04:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.518 12:04:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:14.086 12:04:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:14.086 [2024-07-15 12:04:27.665937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:14.086 BaseBdev3 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local 
bdev_name=BaseBdev3 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:14.345 12:04:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:23:14.604 [ 00:23:14.604 { 00:23:14.604 "name": "BaseBdev3", 00:23:14.604 "aliases": [ 00:23:14.604 "8672da63-bbd9-4e98-8059-65b19c39e5ee" 00:23:14.604 ], 00:23:14.604 "product_name": "Malloc disk", 00:23:14.604 "block_size": 512, 00:23:14.604 "num_blocks": 65536, 00:23:14.604 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:14.604 "assigned_rate_limits": { 00:23:14.604 "rw_ios_per_sec": 0, 00:23:14.604 "rw_mbytes_per_sec": 0, 00:23:14.604 "r_mbytes_per_sec": 0, 00:23:14.604 "w_mbytes_per_sec": 0 00:23:14.604 }, 00:23:14.604 "claimed": true, 00:23:14.604 "claim_type": "exclusive_write", 00:23:14.604 "zoned": false, 00:23:14.604 "supported_io_types": { 00:23:14.604 "read": true, 00:23:14.604 "write": true, 00:23:14.604 "unmap": true, 00:23:14.604 "flush": true, 00:23:14.604 "reset": true, 00:23:14.604 "nvme_admin": false, 00:23:14.604 "nvme_io": false, 00:23:14.604 "nvme_io_md": false, 00:23:14.604 "write_zeroes": true, 00:23:14.604 "zcopy": true, 00:23:14.604 "get_zone_info": false, 00:23:14.604 "zone_management": false, 00:23:14.604 "zone_append": false, 00:23:14.604 
"compare": false, 00:23:14.604 "compare_and_write": false, 00:23:14.604 "abort": true, 00:23:14.604 "seek_hole": false, 00:23:14.604 "seek_data": false, 00:23:14.604 "copy": true, 00:23:14.604 "nvme_iov_md": false 00:23:14.604 }, 00:23:14.604 "memory_domains": [ 00:23:14.604 { 00:23:14.604 "dma_device_id": "system", 00:23:14.604 "dma_device_type": 1 00:23:14.604 }, 00:23:14.604 { 00:23:14.604 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:14.604 "dma_device_type": 2 00:23:14.604 } 00:23:14.604 ], 00:23:14.604 "driver_specific": {} 00:23:14.604 } 00:23:14.604 ] 00:23:14.604 12:04:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:14.604 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:14.604 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:14.604 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:14.605 12:04:28 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:14.605 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:14.864 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:14.864 "name": "Existed_Raid", 00:23:14.864 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:14.864 "strip_size_kb": 0, 00:23:14.864 "state": "configuring", 00:23:14.864 "raid_level": "raid1", 00:23:14.864 "superblock": true, 00:23:14.864 "num_base_bdevs": 4, 00:23:14.864 "num_base_bdevs_discovered": 3, 00:23:14.864 "num_base_bdevs_operational": 4, 00:23:14.864 "base_bdevs_list": [ 00:23:14.864 { 00:23:14.864 "name": "BaseBdev1", 00:23:14.864 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:14.864 "is_configured": true, 00:23:14.864 "data_offset": 2048, 00:23:14.864 "data_size": 63488 00:23:14.864 }, 00:23:14.864 { 00:23:14.864 "name": "BaseBdev2", 00:23:14.864 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:14.864 "is_configured": true, 00:23:14.864 "data_offset": 2048, 00:23:14.864 "data_size": 63488 00:23:14.864 }, 00:23:14.864 { 00:23:14.864 "name": "BaseBdev3", 00:23:14.864 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:14.864 "is_configured": true, 00:23:14.864 "data_offset": 2048, 00:23:14.864 "data_size": 63488 00:23:14.864 }, 00:23:14.864 { 00:23:14.864 "name": "BaseBdev4", 00:23:14.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.864 "is_configured": false, 00:23:14.864 "data_offset": 0, 00:23:14.864 "data_size": 0 00:23:14.864 } 00:23:14.864 ] 00:23:14.864 }' 00:23:14.864 12:04:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:14.864 12:04:28 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:15.801 [2024-07-15 12:04:29.277586] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:15.801 [2024-07-15 12:04:29.277767] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c3b4a0 00:23:15.801 [2024-07-15 12:04:29.277781] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:15.801 [2024-07-15 12:04:29.277962] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3b0a0 00:23:15.801 [2024-07-15 12:04:29.278092] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c3b4a0 00:23:15.801 [2024-07-15 12:04:29.278102] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c3b4a0 00:23:15.801 [2024-07-15 12:04:29.278197] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:15.801 BaseBdev4 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:15.801 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:16.061 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:16.319 [ 00:23:16.319 { 00:23:16.319 "name": "BaseBdev4", 00:23:16.319 "aliases": [ 00:23:16.319 "4b789b9d-377d-489a-8e70-12a023596fe3" 00:23:16.319 ], 00:23:16.319 "product_name": "Malloc disk", 00:23:16.319 "block_size": 512, 00:23:16.319 "num_blocks": 65536, 00:23:16.319 "uuid": "4b789b9d-377d-489a-8e70-12a023596fe3", 00:23:16.319 "assigned_rate_limits": { 00:23:16.319 "rw_ios_per_sec": 0, 00:23:16.319 "rw_mbytes_per_sec": 0, 00:23:16.319 "r_mbytes_per_sec": 0, 00:23:16.319 "w_mbytes_per_sec": 0 00:23:16.319 }, 00:23:16.319 "claimed": true, 00:23:16.319 "claim_type": "exclusive_write", 00:23:16.319 "zoned": false, 00:23:16.319 "supported_io_types": { 00:23:16.319 "read": true, 00:23:16.319 "write": true, 00:23:16.319 "unmap": true, 00:23:16.319 "flush": true, 00:23:16.319 "reset": true, 00:23:16.319 "nvme_admin": false, 00:23:16.319 "nvme_io": false, 00:23:16.319 "nvme_io_md": false, 00:23:16.319 "write_zeroes": true, 00:23:16.319 "zcopy": true, 00:23:16.319 "get_zone_info": false, 00:23:16.319 "zone_management": false, 00:23:16.319 "zone_append": false, 00:23:16.319 "compare": false, 00:23:16.319 "compare_and_write": false, 00:23:16.319 "abort": true, 00:23:16.319 "seek_hole": false, 00:23:16.319 "seek_data": false, 00:23:16.319 "copy": true, 00:23:16.319 "nvme_iov_md": false 00:23:16.319 }, 00:23:16.319 "memory_domains": [ 00:23:16.319 { 00:23:16.319 "dma_device_id": "system", 00:23:16.319 "dma_device_type": 1 00:23:16.319 }, 00:23:16.319 { 00:23:16.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:16.319 "dma_device_type": 2 00:23:16.319 } 00:23:16.319 ], 00:23:16.319 "driver_specific": {} 00:23:16.319 } 00:23:16.319 ] 
00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:16.319 12:04:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:16.577 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:16.577 "name": "Existed_Raid", 00:23:16.577 
"uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:16.577 "strip_size_kb": 0, 00:23:16.577 "state": "online", 00:23:16.577 "raid_level": "raid1", 00:23:16.577 "superblock": true, 00:23:16.577 "num_base_bdevs": 4, 00:23:16.577 "num_base_bdevs_discovered": 4, 00:23:16.577 "num_base_bdevs_operational": 4, 00:23:16.577 "base_bdevs_list": [ 00:23:16.577 { 00:23:16.577 "name": "BaseBdev1", 00:23:16.577 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:16.577 "is_configured": true, 00:23:16.577 "data_offset": 2048, 00:23:16.577 "data_size": 63488 00:23:16.577 }, 00:23:16.577 { 00:23:16.577 "name": "BaseBdev2", 00:23:16.577 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:16.577 "is_configured": true, 00:23:16.577 "data_offset": 2048, 00:23:16.577 "data_size": 63488 00:23:16.577 }, 00:23:16.577 { 00:23:16.577 "name": "BaseBdev3", 00:23:16.577 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:16.577 "is_configured": true, 00:23:16.577 "data_offset": 2048, 00:23:16.577 "data_size": 63488 00:23:16.577 }, 00:23:16.577 { 00:23:16.577 "name": "BaseBdev4", 00:23:16.577 "uuid": "4b789b9d-377d-489a-8e70-12a023596fe3", 00:23:16.577 "is_configured": true, 00:23:16.577 "data_offset": 2048, 00:23:16.577 "data_size": 63488 00:23:16.577 } 00:23:16.577 ] 00:23:16.577 }' 00:23:16.577 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:16.577 12:04:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:17.144 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:23:17.144 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:17.144 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:17.144 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:17.145 12:04:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:17.145 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:17.145 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:17.145 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:17.403 [2024-07-15 12:04:30.934323] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:17.403 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:17.403 "name": "Existed_Raid", 00:23:17.403 "aliases": [ 00:23:17.403 "4fcae223-0b54-4ae0-838c-4fe53cd19a97" 00:23:17.403 ], 00:23:17.403 "product_name": "Raid Volume", 00:23:17.403 "block_size": 512, 00:23:17.403 "num_blocks": 63488, 00:23:17.403 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:17.403 "assigned_rate_limits": { 00:23:17.403 "rw_ios_per_sec": 0, 00:23:17.403 "rw_mbytes_per_sec": 0, 00:23:17.403 "r_mbytes_per_sec": 0, 00:23:17.403 "w_mbytes_per_sec": 0 00:23:17.403 }, 00:23:17.403 "claimed": false, 00:23:17.403 "zoned": false, 00:23:17.403 "supported_io_types": { 00:23:17.403 "read": true, 00:23:17.403 "write": true, 00:23:17.403 "unmap": false, 00:23:17.403 "flush": false, 00:23:17.403 "reset": true, 00:23:17.403 "nvme_admin": false, 00:23:17.403 "nvme_io": false, 00:23:17.403 "nvme_io_md": false, 00:23:17.403 "write_zeroes": true, 00:23:17.403 "zcopy": false, 00:23:17.403 "get_zone_info": false, 00:23:17.403 "zone_management": false, 00:23:17.403 "zone_append": false, 00:23:17.403 "compare": false, 00:23:17.403 "compare_and_write": false, 00:23:17.403 "abort": false, 00:23:17.403 "seek_hole": false, 00:23:17.403 "seek_data": false, 00:23:17.403 "copy": false, 00:23:17.403 "nvme_iov_md": false 00:23:17.403 }, 00:23:17.403 
"memory_domains": [ 00:23:17.403 { 00:23:17.403 "dma_device_id": "system", 00:23:17.403 "dma_device_type": 1 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.403 "dma_device_type": 2 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "system", 00:23:17.403 "dma_device_type": 1 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.403 "dma_device_type": 2 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "system", 00:23:17.403 "dma_device_type": 1 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.403 "dma_device_type": 2 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "system", 00:23:17.403 "dma_device_type": 1 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.403 "dma_device_type": 2 00:23:17.403 } 00:23:17.403 ], 00:23:17.403 "driver_specific": { 00:23:17.403 "raid": { 00:23:17.403 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:17.403 "strip_size_kb": 0, 00:23:17.403 "state": "online", 00:23:17.403 "raid_level": "raid1", 00:23:17.403 "superblock": true, 00:23:17.403 "num_base_bdevs": 4, 00:23:17.403 "num_base_bdevs_discovered": 4, 00:23:17.403 "num_base_bdevs_operational": 4, 00:23:17.403 "base_bdevs_list": [ 00:23:17.403 { 00:23:17.403 "name": "BaseBdev1", 00:23:17.403 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:17.403 "is_configured": true, 00:23:17.403 "data_offset": 2048, 00:23:17.403 "data_size": 63488 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "name": "BaseBdev2", 00:23:17.403 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:17.403 "is_configured": true, 00:23:17.403 "data_offset": 2048, 00:23:17.403 "data_size": 63488 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "name": "BaseBdev3", 00:23:17.403 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:17.403 "is_configured": true, 00:23:17.403 "data_offset": 2048, 00:23:17.403 
"data_size": 63488 00:23:17.403 }, 00:23:17.403 { 00:23:17.403 "name": "BaseBdev4", 00:23:17.403 "uuid": "4b789b9d-377d-489a-8e70-12a023596fe3", 00:23:17.403 "is_configured": true, 00:23:17.403 "data_offset": 2048, 00:23:17.403 "data_size": 63488 00:23:17.403 } 00:23:17.403 ] 00:23:17.403 } 00:23:17.403 } 00:23:17.403 }' 00:23:17.403 12:04:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:23:17.662 BaseBdev2 00:23:17.662 BaseBdev3 00:23:17.662 BaseBdev4' 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:17.662 "name": "BaseBdev1", 00:23:17.662 "aliases": [ 00:23:17.662 "306f193a-b110-46d9-82ee-09b8f01b7381" 00:23:17.662 ], 00:23:17.662 "product_name": "Malloc disk", 00:23:17.662 "block_size": 512, 00:23:17.662 "num_blocks": 65536, 00:23:17.662 "uuid": "306f193a-b110-46d9-82ee-09b8f01b7381", 00:23:17.662 "assigned_rate_limits": { 00:23:17.662 "rw_ios_per_sec": 0, 00:23:17.662 "rw_mbytes_per_sec": 0, 00:23:17.662 "r_mbytes_per_sec": 0, 00:23:17.662 "w_mbytes_per_sec": 0 00:23:17.662 }, 00:23:17.662 "claimed": true, 00:23:17.662 "claim_type": "exclusive_write", 00:23:17.662 "zoned": false, 00:23:17.662 "supported_io_types": { 00:23:17.662 "read": true, 00:23:17.662 "write": true, 00:23:17.662 "unmap": true, 00:23:17.662 "flush": true, 00:23:17.662 "reset": true, 
00:23:17.662 "nvme_admin": false, 00:23:17.662 "nvme_io": false, 00:23:17.662 "nvme_io_md": false, 00:23:17.662 "write_zeroes": true, 00:23:17.662 "zcopy": true, 00:23:17.662 "get_zone_info": false, 00:23:17.662 "zone_management": false, 00:23:17.662 "zone_append": false, 00:23:17.662 "compare": false, 00:23:17.662 "compare_and_write": false, 00:23:17.662 "abort": true, 00:23:17.662 "seek_hole": false, 00:23:17.662 "seek_data": false, 00:23:17.662 "copy": true, 00:23:17.662 "nvme_iov_md": false 00:23:17.662 }, 00:23:17.662 "memory_domains": [ 00:23:17.662 { 00:23:17.662 "dma_device_id": "system", 00:23:17.662 "dma_device_type": 1 00:23:17.662 }, 00:23:17.662 { 00:23:17.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:17.662 "dma_device_type": 2 00:23:17.662 } 00:23:17.662 ], 00:23:17.662 "driver_specific": {} 00:23:17.662 }' 00:23:17.662 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:17.940 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:18.199 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:18.199 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:18.199 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:18.199 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:18.199 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:18.457 "name": "BaseBdev2", 00:23:18.457 "aliases": [ 00:23:18.457 "af8e513f-6d39-4b5f-891f-b61ee005efe2" 00:23:18.457 ], 00:23:18.457 "product_name": "Malloc disk", 00:23:18.457 "block_size": 512, 00:23:18.457 "num_blocks": 65536, 00:23:18.457 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:18.457 "assigned_rate_limits": { 00:23:18.457 "rw_ios_per_sec": 0, 00:23:18.457 "rw_mbytes_per_sec": 0, 00:23:18.457 "r_mbytes_per_sec": 0, 00:23:18.457 "w_mbytes_per_sec": 0 00:23:18.457 }, 00:23:18.457 "claimed": true, 00:23:18.457 "claim_type": "exclusive_write", 00:23:18.457 "zoned": false, 00:23:18.457 "supported_io_types": { 00:23:18.457 "read": true, 00:23:18.457 "write": true, 00:23:18.457 "unmap": true, 00:23:18.457 "flush": true, 00:23:18.457 "reset": true, 00:23:18.457 "nvme_admin": false, 00:23:18.457 "nvme_io": false, 00:23:18.457 "nvme_io_md": false, 00:23:18.457 "write_zeroes": true, 00:23:18.457 "zcopy": true, 00:23:18.457 "get_zone_info": false, 00:23:18.457 "zone_management": false, 00:23:18.457 "zone_append": false, 00:23:18.457 "compare": false, 00:23:18.457 "compare_and_write": false, 00:23:18.457 "abort": true, 00:23:18.457 "seek_hole": false, 00:23:18.457 "seek_data": false, 00:23:18.457 "copy": true, 00:23:18.457 "nvme_iov_md": false 00:23:18.457 }, 00:23:18.457 "memory_domains": [ 00:23:18.457 { 
00:23:18.457 "dma_device_id": "system", 00:23:18.457 "dma_device_type": 1 00:23:18.457 }, 00:23:18.457 { 00:23:18.457 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:18.457 "dma_device_type": 2 00:23:18.457 } 00:23:18.457 ], 00:23:18.457 "driver_specific": {} 00:23:18.457 }' 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:18.457 12:04:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:18.457 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:18.457 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:18.715 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:18.974 12:04:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:18.974 "name": "BaseBdev3", 00:23:18.974 "aliases": [ 00:23:18.974 "8672da63-bbd9-4e98-8059-65b19c39e5ee" 00:23:18.974 ], 00:23:18.974 "product_name": "Malloc disk", 00:23:18.974 "block_size": 512, 00:23:18.974 "num_blocks": 65536, 00:23:18.974 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:18.974 "assigned_rate_limits": { 00:23:18.974 "rw_ios_per_sec": 0, 00:23:18.974 "rw_mbytes_per_sec": 0, 00:23:18.974 "r_mbytes_per_sec": 0, 00:23:18.974 "w_mbytes_per_sec": 0 00:23:18.974 }, 00:23:18.974 "claimed": true, 00:23:18.974 "claim_type": "exclusive_write", 00:23:18.974 "zoned": false, 00:23:18.974 "supported_io_types": { 00:23:18.974 "read": true, 00:23:18.974 "write": true, 00:23:18.974 "unmap": true, 00:23:18.974 "flush": true, 00:23:18.974 "reset": true, 00:23:18.974 "nvme_admin": false, 00:23:18.974 "nvme_io": false, 00:23:18.974 "nvme_io_md": false, 00:23:18.974 "write_zeroes": true, 00:23:18.974 "zcopy": true, 00:23:18.974 "get_zone_info": false, 00:23:18.974 "zone_management": false, 00:23:18.974 "zone_append": false, 00:23:18.974 "compare": false, 00:23:18.974 "compare_and_write": false, 00:23:18.974 "abort": true, 00:23:18.974 "seek_hole": false, 00:23:18.974 "seek_data": false, 00:23:18.974 "copy": true, 00:23:18.974 "nvme_iov_md": false 00:23:18.974 }, 00:23:18.974 "memory_domains": [ 00:23:18.974 { 00:23:18.974 "dma_device_id": "system", 00:23:18.974 "dma_device_type": 1 00:23:18.974 }, 00:23:18.974 { 00:23:18.974 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:18.974 "dma_device_type": 2 00:23:18.974 } 00:23:18.974 ], 00:23:18.974 "driver_specific": {} 00:23:18.974 }' 00:23:18.974 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:18.974 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:19.233 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:19.492 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:19.492 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:19.492 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:19.492 12:04:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:19.751 "name": "BaseBdev4", 00:23:19.751 "aliases": [ 00:23:19.751 "4b789b9d-377d-489a-8e70-12a023596fe3" 00:23:19.751 ], 00:23:19.751 "product_name": "Malloc disk", 00:23:19.751 "block_size": 512, 00:23:19.751 "num_blocks": 65536, 00:23:19.751 "uuid": "4b789b9d-377d-489a-8e70-12a023596fe3", 00:23:19.751 "assigned_rate_limits": { 00:23:19.751 "rw_ios_per_sec": 0, 00:23:19.751 "rw_mbytes_per_sec": 0, 00:23:19.751 "r_mbytes_per_sec": 0, 00:23:19.751 "w_mbytes_per_sec": 0 
00:23:19.751 }, 00:23:19.751 "claimed": true, 00:23:19.751 "claim_type": "exclusive_write", 00:23:19.751 "zoned": false, 00:23:19.751 "supported_io_types": { 00:23:19.751 "read": true, 00:23:19.751 "write": true, 00:23:19.751 "unmap": true, 00:23:19.751 "flush": true, 00:23:19.751 "reset": true, 00:23:19.751 "nvme_admin": false, 00:23:19.751 "nvme_io": false, 00:23:19.751 "nvme_io_md": false, 00:23:19.751 "write_zeroes": true, 00:23:19.751 "zcopy": true, 00:23:19.751 "get_zone_info": false, 00:23:19.751 "zone_management": false, 00:23:19.751 "zone_append": false, 00:23:19.751 "compare": false, 00:23:19.751 "compare_and_write": false, 00:23:19.751 "abort": true, 00:23:19.751 "seek_hole": false, 00:23:19.751 "seek_data": false, 00:23:19.751 "copy": true, 00:23:19.751 "nvme_iov_md": false 00:23:19.751 }, 00:23:19.751 "memory_domains": [ 00:23:19.751 { 00:23:19.751 "dma_device_id": "system", 00:23:19.751 "dma_device_type": 1 00:23:19.751 }, 00:23:19.751 { 00:23:19.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:19.751 "dma_device_type": 2 00:23:19.751 } 00:23:19.751 ], 00:23:19.751 "driver_specific": {} 00:23:19.751 }' 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:19.751 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:20.009 
12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:20.009 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.009 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:20.009 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:20.009 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:20.268 [2024-07-15 12:04:33.689343] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:20.268 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.528 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:20.528 "name": "Existed_Raid", 00:23:20.528 "uuid": "4fcae223-0b54-4ae0-838c-4fe53cd19a97", 00:23:20.528 "strip_size_kb": 0, 00:23:20.528 "state": "online", 00:23:20.528 "raid_level": "raid1", 00:23:20.528 "superblock": true, 00:23:20.528 "num_base_bdevs": 4, 00:23:20.528 "num_base_bdevs_discovered": 3, 00:23:20.528 "num_base_bdevs_operational": 3, 00:23:20.528 "base_bdevs_list": [ 00:23:20.528 { 00:23:20.528 "name": null, 00:23:20.528 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:20.528 "is_configured": false, 00:23:20.528 "data_offset": 2048, 00:23:20.528 "data_size": 63488 00:23:20.528 }, 00:23:20.528 { 00:23:20.528 "name": "BaseBdev2", 00:23:20.528 "uuid": "af8e513f-6d39-4b5f-891f-b61ee005efe2", 00:23:20.528 "is_configured": true, 00:23:20.528 "data_offset": 2048, 00:23:20.528 "data_size": 63488 00:23:20.528 }, 00:23:20.528 { 00:23:20.528 "name": "BaseBdev3", 00:23:20.528 "uuid": "8672da63-bbd9-4e98-8059-65b19c39e5ee", 00:23:20.528 "is_configured": true, 00:23:20.528 "data_offset": 2048, 00:23:20.528 "data_size": 63488 00:23:20.528 }, 00:23:20.528 { 00:23:20.528 "name": 
"BaseBdev4", 00:23:20.528 "uuid": "4b789b9d-377d-489a-8e70-12a023596fe3", 00:23:20.528 "is_configured": true, 00:23:20.528 "data_offset": 2048, 00:23:20.528 "data_size": 63488 00:23:20.528 } 00:23:20.528 ] 00:23:20.528 }' 00:23:20.528 12:04:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:20.528 12:04:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:21.153 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:23:21.153 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:21.153 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.153 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:21.412 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:21.412 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:21.412 12:04:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:23:21.671 [2024-07-15 12:04:35.045965] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:21.671 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:21.671 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:21.671 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.671 12:04:35 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:21.930 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:21.930 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:21.930 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:23:22.188 [2024-07-15 12:04:35.551581] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:22.188 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:22.188 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:22.188 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.188 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:23:22.446 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:23:22.446 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:23:22.446 12:04:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:23:22.705 [2024-07-15 12:04:36.107634] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:23:22.705 [2024-07-15 12:04:36.107727] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:22.705 [2024-07-15 12:04:36.118583] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:22.705 [2024-07-15 12:04:36.118620] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:22.705 [2024-07-15 12:04:36.118632] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c3b4a0 name Existed_Raid, state offline 00:23:22.705 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:23:22.705 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:23:22.705 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:22.705 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:23.274 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:23:23.534 BaseBdev2 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@899 -- # local i 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:23.534 12:04:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:23.793 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:23:24.052 [ 00:23:24.052 { 00:23:24.052 "name": "BaseBdev2", 00:23:24.052 "aliases": [ 00:23:24.052 "8dd449ee-27d2-400c-8ff1-5296ee9605c9" 00:23:24.052 ], 00:23:24.052 "product_name": "Malloc disk", 00:23:24.052 "block_size": 512, 00:23:24.052 "num_blocks": 65536, 00:23:24.052 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:24.052 "assigned_rate_limits": { 00:23:24.052 "rw_ios_per_sec": 0, 00:23:24.052 "rw_mbytes_per_sec": 0, 00:23:24.052 "r_mbytes_per_sec": 0, 00:23:24.052 "w_mbytes_per_sec": 0 00:23:24.052 }, 00:23:24.052 "claimed": false, 00:23:24.052 "zoned": false, 00:23:24.052 "supported_io_types": { 00:23:24.052 "read": true, 00:23:24.052 "write": true, 00:23:24.052 "unmap": true, 00:23:24.052 "flush": true, 00:23:24.052 "reset": true, 00:23:24.052 "nvme_admin": false, 00:23:24.052 "nvme_io": false, 00:23:24.052 "nvme_io_md": false, 00:23:24.052 "write_zeroes": true, 00:23:24.052 "zcopy": true, 00:23:24.052 "get_zone_info": false, 00:23:24.052 "zone_management": false, 00:23:24.052 "zone_append": false, 00:23:24.052 "compare": false, 00:23:24.052 "compare_and_write": false, 00:23:24.052 "abort": true, 00:23:24.052 "seek_hole": false, 00:23:24.052 "seek_data": false, 00:23:24.052 "copy": true, 00:23:24.052 "nvme_iov_md": false 00:23:24.052 }, 00:23:24.052 
"memory_domains": [ 00:23:24.052 { 00:23:24.052 "dma_device_id": "system", 00:23:24.052 "dma_device_type": 1 00:23:24.052 }, 00:23:24.052 { 00:23:24.052 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.052 "dma_device_type": 2 00:23:24.052 } 00:23:24.052 ], 00:23:24.052 "driver_specific": {} 00:23:24.052 } 00:23:24.052 ] 00:23:24.052 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:24.052 12:04:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:24.052 12:04:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:24.052 12:04:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:23:24.620 BaseBdev3 00:23:24.620 12:04:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:23:24.620 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev3 00:23:24.620 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:24.620 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:24.621 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:24.621 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:24.621 12:04:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:24.621 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:23:24.880 [ 00:23:24.880 { 00:23:24.880 "name": "BaseBdev3", 00:23:24.880 "aliases": [ 00:23:24.880 "fa3d6c09-e3da-4f43-9143-bb81c02d53e6" 00:23:24.880 ], 00:23:24.880 "product_name": "Malloc disk", 00:23:24.880 "block_size": 512, 00:23:24.880 "num_blocks": 65536, 00:23:24.880 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:24.880 "assigned_rate_limits": { 00:23:24.880 "rw_ios_per_sec": 0, 00:23:24.880 "rw_mbytes_per_sec": 0, 00:23:24.880 "r_mbytes_per_sec": 0, 00:23:24.880 "w_mbytes_per_sec": 0 00:23:24.880 }, 00:23:24.880 "claimed": false, 00:23:24.880 "zoned": false, 00:23:24.880 "supported_io_types": { 00:23:24.880 "read": true, 00:23:24.880 "write": true, 00:23:24.880 "unmap": true, 00:23:24.880 "flush": true, 00:23:24.880 "reset": true, 00:23:24.880 "nvme_admin": false, 00:23:24.880 "nvme_io": false, 00:23:24.880 "nvme_io_md": false, 00:23:24.880 "write_zeroes": true, 00:23:24.880 "zcopy": true, 00:23:24.880 "get_zone_info": false, 00:23:24.880 "zone_management": false, 00:23:24.880 "zone_append": false, 00:23:24.880 "compare": false, 00:23:24.880 "compare_and_write": false, 00:23:24.880 "abort": true, 00:23:24.880 "seek_hole": false, 00:23:24.880 "seek_data": false, 00:23:24.880 "copy": true, 00:23:24.880 "nvme_iov_md": false 00:23:24.880 }, 00:23:24.880 "memory_domains": [ 00:23:24.880 { 00:23:24.880 "dma_device_id": "system", 00:23:24.880 "dma_device_type": 1 00:23:24.880 }, 00:23:24.880 { 00:23:24.880 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:24.880 "dma_device_type": 2 00:23:24.880 } 00:23:24.880 ], 00:23:24.880 "driver_specific": {} 00:23:24.880 } 00:23:24.880 ] 00:23:24.880 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:24.880 12:04:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:24.880 12:04:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:24.880 12:04:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:23:25.140 BaseBdev4 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev4 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:25.140 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:25.399 12:04:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:23:25.660 [ 00:23:25.660 { 00:23:25.660 "name": "BaseBdev4", 00:23:25.660 "aliases": [ 00:23:25.660 "796d2778-4481-48d8-b032-4ebc69e10908" 00:23:25.660 ], 00:23:25.660 "product_name": "Malloc disk", 00:23:25.660 "block_size": 512, 00:23:25.660 "num_blocks": 65536, 00:23:25.660 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:25.660 "assigned_rate_limits": { 00:23:25.660 "rw_ios_per_sec": 0, 00:23:25.660 "rw_mbytes_per_sec": 0, 00:23:25.660 "r_mbytes_per_sec": 0, 00:23:25.660 "w_mbytes_per_sec": 0 00:23:25.660 }, 00:23:25.660 "claimed": false, 00:23:25.660 "zoned": false, 00:23:25.660 "supported_io_types": { 00:23:25.660 "read": true, 
00:23:25.660 "write": true, 00:23:25.660 "unmap": true, 00:23:25.660 "flush": true, 00:23:25.660 "reset": true, 00:23:25.660 "nvme_admin": false, 00:23:25.660 "nvme_io": false, 00:23:25.660 "nvme_io_md": false, 00:23:25.660 "write_zeroes": true, 00:23:25.660 "zcopy": true, 00:23:25.660 "get_zone_info": false, 00:23:25.660 "zone_management": false, 00:23:25.660 "zone_append": false, 00:23:25.660 "compare": false, 00:23:25.660 "compare_and_write": false, 00:23:25.660 "abort": true, 00:23:25.660 "seek_hole": false, 00:23:25.660 "seek_data": false, 00:23:25.660 "copy": true, 00:23:25.660 "nvme_iov_md": false 00:23:25.660 }, 00:23:25.660 "memory_domains": [ 00:23:25.660 { 00:23:25.660 "dma_device_id": "system", 00:23:25.660 "dma_device_type": 1 00:23:25.660 }, 00:23:25.660 { 00:23:25.660 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:25.660 "dma_device_type": 2 00:23:25.660 } 00:23:25.660 ], 00:23:25.660 "driver_specific": {} 00:23:25.660 } 00:23:25.660 ] 00:23:25.660 12:04:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:25.660 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:23:25.660 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:23:25.660 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:23:25.919 [2024-07-15 12:04:39.397815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:23:25.919 [2024-07-15 12:04:39.397857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:23:25.919 [2024-07-15 12:04:39.397876] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:25.919 [2024-07-15 12:04:39.399246] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:25.919 [2024-07-15 12:04:39.399287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.919 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:26.177 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:26.177 "name": "Existed_Raid", 00:23:26.177 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:26.177 "strip_size_kb": 0, 00:23:26.177 "state": 
"configuring", 00:23:26.177 "raid_level": "raid1", 00:23:26.177 "superblock": true, 00:23:26.177 "num_base_bdevs": 4, 00:23:26.177 "num_base_bdevs_discovered": 3, 00:23:26.177 "num_base_bdevs_operational": 4, 00:23:26.177 "base_bdevs_list": [ 00:23:26.177 { 00:23:26.177 "name": "BaseBdev1", 00:23:26.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:26.177 "is_configured": false, 00:23:26.177 "data_offset": 0, 00:23:26.177 "data_size": 0 00:23:26.177 }, 00:23:26.177 { 00:23:26.177 "name": "BaseBdev2", 00:23:26.177 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:26.177 "is_configured": true, 00:23:26.177 "data_offset": 2048, 00:23:26.177 "data_size": 63488 00:23:26.177 }, 00:23:26.177 { 00:23:26.178 "name": "BaseBdev3", 00:23:26.178 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:26.178 "is_configured": true, 00:23:26.178 "data_offset": 2048, 00:23:26.178 "data_size": 63488 00:23:26.178 }, 00:23:26.178 { 00:23:26.178 "name": "BaseBdev4", 00:23:26.178 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:26.178 "is_configured": true, 00:23:26.178 "data_offset": 2048, 00:23:26.178 "data_size": 63488 00:23:26.178 } 00:23:26.178 ] 00:23:26.178 }' 00:23:26.178 12:04:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:26.178 12:04:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:26.744 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:27.003 [2024-07-15 12:04:40.492697] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:27.003 
12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:27.003 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:27.262 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.262 "name": "Existed_Raid", 00:23:27.262 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:27.262 "strip_size_kb": 0, 00:23:27.262 "state": "configuring", 00:23:27.262 "raid_level": "raid1", 00:23:27.262 "superblock": true, 00:23:27.262 "num_base_bdevs": 4, 00:23:27.262 "num_base_bdevs_discovered": 2, 00:23:27.262 "num_base_bdevs_operational": 4, 00:23:27.262 "base_bdevs_list": [ 00:23:27.262 { 00:23:27.262 "name": "BaseBdev1", 00:23:27.262 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.262 "is_configured": false, 00:23:27.262 "data_offset": 0, 00:23:27.262 "data_size": 0 00:23:27.262 }, 00:23:27.262 { 00:23:27.262 
"name": null, 00:23:27.262 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:27.262 "is_configured": false, 00:23:27.262 "data_offset": 2048, 00:23:27.262 "data_size": 63488 00:23:27.262 }, 00:23:27.262 { 00:23:27.262 "name": "BaseBdev3", 00:23:27.262 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:27.262 "is_configured": true, 00:23:27.262 "data_offset": 2048, 00:23:27.262 "data_size": 63488 00:23:27.262 }, 00:23:27.262 { 00:23:27.262 "name": "BaseBdev4", 00:23:27.262 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:27.262 "is_configured": true, 00:23:27.262 "data_offset": 2048, 00:23:27.262 "data_size": 63488 00:23:27.262 } 00:23:27.262 ] 00:23:27.262 }' 00:23:27.262 12:04:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.262 12:04:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:28.198 12:04:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.198 12:04:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:28.456 12:04:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:23:28.456 12:04:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:23:28.716 [2024-07-15 12:04:42.108504] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:28.716 BaseBdev1 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:23:28.716 12:04:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:28.716 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:29.284 12:04:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:23:29.543 [ 00:23:29.543 { 00:23:29.543 "name": "BaseBdev1", 00:23:29.543 "aliases": [ 00:23:29.543 "2ec04dc7-8a38-4252-a34c-d7b21da1db2a" 00:23:29.543 ], 00:23:29.543 "product_name": "Malloc disk", 00:23:29.543 "block_size": 512, 00:23:29.543 "num_blocks": 65536, 00:23:29.543 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:29.543 "assigned_rate_limits": { 00:23:29.543 "rw_ios_per_sec": 0, 00:23:29.543 "rw_mbytes_per_sec": 0, 00:23:29.543 "r_mbytes_per_sec": 0, 00:23:29.543 "w_mbytes_per_sec": 0 00:23:29.543 }, 00:23:29.543 "claimed": true, 00:23:29.543 "claim_type": "exclusive_write", 00:23:29.543 "zoned": false, 00:23:29.543 "supported_io_types": { 00:23:29.543 "read": true, 00:23:29.543 "write": true, 00:23:29.543 "unmap": true, 00:23:29.543 "flush": true, 00:23:29.543 "reset": true, 00:23:29.543 "nvme_admin": false, 00:23:29.543 "nvme_io": false, 00:23:29.543 "nvme_io_md": false, 00:23:29.543 "write_zeroes": true, 00:23:29.543 "zcopy": true, 00:23:29.543 "get_zone_info": false, 00:23:29.543 "zone_management": false, 00:23:29.543 "zone_append": false, 00:23:29.543 "compare": false, 00:23:29.543 
"compare_and_write": false, 00:23:29.543 "abort": true, 00:23:29.543 "seek_hole": false, 00:23:29.543 "seek_data": false, 00:23:29.543 "copy": true, 00:23:29.543 "nvme_iov_md": false 00:23:29.543 }, 00:23:29.543 "memory_domains": [ 00:23:29.543 { 00:23:29.543 "dma_device_id": "system", 00:23:29.543 "dma_device_type": 1 00:23:29.543 }, 00:23:29.543 { 00:23:29.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:29.543 "dma_device_type": 2 00:23:29.543 } 00:23:29.543 ], 00:23:29.543 "driver_specific": {} 00:23:29.543 } 00:23:29.543 ] 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.802 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:30.368 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.368 "name": "Existed_Raid", 00:23:30.368 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:30.368 "strip_size_kb": 0, 00:23:30.368 "state": "configuring", 00:23:30.368 "raid_level": "raid1", 00:23:30.368 "superblock": true, 00:23:30.368 "num_base_bdevs": 4, 00:23:30.368 "num_base_bdevs_discovered": 3, 00:23:30.369 "num_base_bdevs_operational": 4, 00:23:30.369 "base_bdevs_list": [ 00:23:30.369 { 00:23:30.369 "name": "BaseBdev1", 00:23:30.369 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:30.369 "is_configured": true, 00:23:30.369 "data_offset": 2048, 00:23:30.369 "data_size": 63488 00:23:30.369 }, 00:23:30.369 { 00:23:30.369 "name": null, 00:23:30.369 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:30.369 "is_configured": false, 00:23:30.369 "data_offset": 2048, 00:23:30.369 "data_size": 63488 00:23:30.369 }, 00:23:30.369 { 00:23:30.369 "name": "BaseBdev3", 00:23:30.369 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:30.369 "is_configured": true, 00:23:30.369 "data_offset": 2048, 00:23:30.369 "data_size": 63488 00:23:30.369 }, 00:23:30.369 { 00:23:30.369 "name": "BaseBdev4", 00:23:30.369 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:30.369 "is_configured": true, 00:23:30.369 "data_offset": 2048, 00:23:30.369 "data_size": 63488 00:23:30.369 } 00:23:30.369 ] 00:23:30.369 }' 00:23:30.369 12:04:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.369 12:04:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:30.935 12:04:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.935 12:04:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:31.193 12:04:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:23:31.193 12:04:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:23:31.451 [2024-07-15 12:04:44.992196] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:31.451 12:04:45 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.710 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:31.710 "name": "Existed_Raid", 00:23:31.710 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:31.710 "strip_size_kb": 0, 00:23:31.710 "state": "configuring", 00:23:31.710 "raid_level": "raid1", 00:23:31.710 "superblock": true, 00:23:31.710 "num_base_bdevs": 4, 00:23:31.710 "num_base_bdevs_discovered": 2, 00:23:31.710 "num_base_bdevs_operational": 4, 00:23:31.710 "base_bdevs_list": [ 00:23:31.710 { 00:23:31.710 "name": "BaseBdev1", 00:23:31.710 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:31.711 "is_configured": true, 00:23:31.711 "data_offset": 2048, 00:23:31.711 "data_size": 63488 00:23:31.711 }, 00:23:31.711 { 00:23:31.711 "name": null, 00:23:31.711 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:31.711 "is_configured": false, 00:23:31.711 "data_offset": 2048, 00:23:31.711 "data_size": 63488 00:23:31.711 }, 00:23:31.711 { 00:23:31.711 "name": null, 00:23:31.711 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:31.711 "is_configured": false, 00:23:31.711 "data_offset": 2048, 00:23:31.711 "data_size": 63488 00:23:31.711 }, 00:23:31.711 { 00:23:31.711 "name": "BaseBdev4", 00:23:31.711 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:31.711 "is_configured": true, 00:23:31.711 "data_offset": 2048, 00:23:31.711 "data_size": 63488 00:23:31.711 } 00:23:31.711 ] 00:23:31.711 }' 00:23:31.711 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:31.711 12:04:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:32.278 12:04:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:32.278 12:04:45 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.537 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:23:32.537 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:23:32.796 [2024-07-15 12:04:46.323774] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:23:32.796 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:33.055 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:33.055 "name": "Existed_Raid", 00:23:33.055 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:33.055 "strip_size_kb": 0, 00:23:33.055 "state": "configuring", 00:23:33.055 "raid_level": "raid1", 00:23:33.055 "superblock": true, 00:23:33.055 "num_base_bdevs": 4, 00:23:33.055 "num_base_bdevs_discovered": 3, 00:23:33.055 "num_base_bdevs_operational": 4, 00:23:33.055 "base_bdevs_list": [ 00:23:33.055 { 00:23:33.055 "name": "BaseBdev1", 00:23:33.055 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:33.055 "is_configured": true, 00:23:33.055 "data_offset": 2048, 00:23:33.055 "data_size": 63488 00:23:33.055 }, 00:23:33.055 { 00:23:33.055 "name": null, 00:23:33.055 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:33.056 "is_configured": false, 00:23:33.056 "data_offset": 2048, 00:23:33.056 "data_size": 63488 00:23:33.056 }, 00:23:33.056 { 00:23:33.056 "name": "BaseBdev3", 00:23:33.056 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:33.056 "is_configured": true, 00:23:33.056 "data_offset": 2048, 00:23:33.056 "data_size": 63488 00:23:33.056 }, 00:23:33.056 { 00:23:33.056 "name": "BaseBdev4", 00:23:33.056 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:33.056 "is_configured": true, 00:23:33.056 "data_offset": 2048, 00:23:33.056 "data_size": 63488 00:23:33.056 } 00:23:33.056 ] 00:23:33.056 }' 00:23:33.056 12:04:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:33.056 12:04:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:33.622 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:23:33.622 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:23:33.881 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:23:33.881 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:23:34.139 [2024-07-15 12:04:47.647293] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:34.139 12:04:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.139 12:04:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:34.706 12:04:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:34.706 "name": "Existed_Raid", 00:23:34.706 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:34.706 "strip_size_kb": 0, 00:23:34.706 "state": "configuring", 00:23:34.706 "raid_level": "raid1", 00:23:34.706 "superblock": true, 00:23:34.706 "num_base_bdevs": 4, 00:23:34.706 "num_base_bdevs_discovered": 2, 00:23:34.706 "num_base_bdevs_operational": 4, 00:23:34.706 "base_bdevs_list": [ 00:23:34.706 { 00:23:34.706 "name": null, 00:23:34.706 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:34.706 "is_configured": false, 00:23:34.706 "data_offset": 2048, 00:23:34.706 "data_size": 63488 00:23:34.706 }, 00:23:34.706 { 00:23:34.706 "name": null, 00:23:34.706 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:34.706 "is_configured": false, 00:23:34.706 "data_offset": 2048, 00:23:34.706 "data_size": 63488 00:23:34.706 }, 00:23:34.706 { 00:23:34.706 "name": "BaseBdev3", 00:23:34.706 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:34.706 "is_configured": true, 00:23:34.706 "data_offset": 2048, 00:23:34.706 "data_size": 63488 00:23:34.706 }, 00:23:34.706 { 00:23:34.706 "name": "BaseBdev4", 00:23:34.706 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:34.706 "is_configured": true, 00:23:34.706 "data_offset": 2048, 00:23:34.706 "data_size": 63488 00:23:34.706 } 00:23:34.706 ] 00:23:34.706 }' 00:23:34.706 12:04:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:34.706 12:04:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:35.274 12:04:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.274 12:04:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:23:35.533 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:23:35.533 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:23:35.792 [2024-07-15 12:04:49.287558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:35.792 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.792 12:04:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:36.051 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.051 "name": "Existed_Raid", 00:23:36.051 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:36.051 "strip_size_kb": 0, 00:23:36.051 "state": "configuring", 00:23:36.051 "raid_level": "raid1", 00:23:36.051 "superblock": true, 00:23:36.051 "num_base_bdevs": 4, 00:23:36.051 "num_base_bdevs_discovered": 3, 00:23:36.051 "num_base_bdevs_operational": 4, 00:23:36.051 "base_bdevs_list": [ 00:23:36.051 { 00:23:36.052 "name": null, 00:23:36.052 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:36.052 "is_configured": false, 00:23:36.052 "data_offset": 2048, 00:23:36.052 "data_size": 63488 00:23:36.052 }, 00:23:36.052 { 00:23:36.052 "name": "BaseBdev2", 00:23:36.052 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:36.052 "is_configured": true, 00:23:36.052 "data_offset": 2048, 00:23:36.052 "data_size": 63488 00:23:36.052 }, 00:23:36.052 { 00:23:36.052 "name": "BaseBdev3", 00:23:36.052 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:36.052 "is_configured": true, 00:23:36.052 "data_offset": 2048, 00:23:36.052 "data_size": 63488 00:23:36.052 }, 00:23:36.052 { 00:23:36.052 "name": "BaseBdev4", 00:23:36.052 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:36.052 "is_configured": true, 00:23:36.052 "data_offset": 2048, 00:23:36.052 "data_size": 63488 00:23:36.052 } 00:23:36.052 ] 00:23:36.052 }' 00:23:36.052 12:04:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.052 12:04:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:36.620 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.620 12:04:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:23:36.879 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:23:36.879 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.879 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:23:37.138 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2ec04dc7-8a38-4252-a34c-d7b21da1db2a 00:23:37.397 [2024-07-15 12:04:50.908151] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:23:37.397 [2024-07-15 12:04:50.908347] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c3e370 00:23:37.397 [2024-07-15 12:04:50.908361] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:37.397 [2024-07-15 12:04:50.908554] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c3f9b0 00:23:37.397 [2024-07-15 12:04:50.908705] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c3e370 00:23:37.397 [2024-07-15 12:04:50.908716] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c3e370 00:23:37.397 [2024-07-15 12:04:50.908826] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:37.397 NewBaseBdev 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@897 -- # local bdev_name=NewBaseBdev 00:23:37.397 12:04:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local i 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:23:37.397 12:04:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:23:37.656 12:04:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:23:37.915 [ 00:23:37.915 { 00:23:37.915 "name": "NewBaseBdev", 00:23:37.915 "aliases": [ 00:23:37.915 "2ec04dc7-8a38-4252-a34c-d7b21da1db2a" 00:23:37.915 ], 00:23:37.915 "product_name": "Malloc disk", 00:23:37.915 "block_size": 512, 00:23:37.915 "num_blocks": 65536, 00:23:37.915 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:37.915 "assigned_rate_limits": { 00:23:37.915 "rw_ios_per_sec": 0, 00:23:37.915 "rw_mbytes_per_sec": 0, 00:23:37.915 "r_mbytes_per_sec": 0, 00:23:37.915 "w_mbytes_per_sec": 0 00:23:37.915 }, 00:23:37.915 "claimed": true, 00:23:37.915 "claim_type": "exclusive_write", 00:23:37.915 "zoned": false, 00:23:37.915 "supported_io_types": { 00:23:37.915 "read": true, 00:23:37.915 "write": true, 00:23:37.915 "unmap": true, 00:23:37.915 "flush": true, 00:23:37.915 "reset": true, 00:23:37.915 "nvme_admin": false, 00:23:37.915 "nvme_io": false, 00:23:37.915 "nvme_io_md": false, 00:23:37.915 "write_zeroes": true, 00:23:37.915 "zcopy": true, 00:23:37.915 "get_zone_info": false, 00:23:37.915 "zone_management": false, 00:23:37.915 "zone_append": false, 00:23:37.915 "compare": false, 00:23:37.915 
"compare_and_write": false, 00:23:37.915 "abort": true, 00:23:37.915 "seek_hole": false, 00:23:37.915 "seek_data": false, 00:23:37.915 "copy": true, 00:23:37.915 "nvme_iov_md": false 00:23:37.915 }, 00:23:37.915 "memory_domains": [ 00:23:37.915 { 00:23:37.915 "dma_device_id": "system", 00:23:37.915 "dma_device_type": 1 00:23:37.915 }, 00:23:37.915 { 00:23:37.915 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:37.915 "dma_device_type": 2 00:23:37.915 } 00:23:37.915 ], 00:23:37.915 "driver_specific": {} 00:23:37.915 } 00:23:37.915 ] 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # return 0 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:23:37.915 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:23:38.175 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:38.175 "name": "Existed_Raid", 00:23:38.175 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:38.175 "strip_size_kb": 0, 00:23:38.175 "state": "online", 00:23:38.175 "raid_level": "raid1", 00:23:38.175 "superblock": true, 00:23:38.175 "num_base_bdevs": 4, 00:23:38.175 "num_base_bdevs_discovered": 4, 00:23:38.175 "num_base_bdevs_operational": 4, 00:23:38.175 "base_bdevs_list": [ 00:23:38.175 { 00:23:38.175 "name": "NewBaseBdev", 00:23:38.175 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:38.175 "is_configured": true, 00:23:38.175 "data_offset": 2048, 00:23:38.175 "data_size": 63488 00:23:38.175 }, 00:23:38.175 { 00:23:38.175 "name": "BaseBdev2", 00:23:38.175 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:38.175 "is_configured": true, 00:23:38.175 "data_offset": 2048, 00:23:38.175 "data_size": 63488 00:23:38.175 }, 00:23:38.175 { 00:23:38.175 "name": "BaseBdev3", 00:23:38.175 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:38.175 "is_configured": true, 00:23:38.175 "data_offset": 2048, 00:23:38.175 "data_size": 63488 00:23:38.175 }, 00:23:38.175 { 00:23:38.175 "name": "BaseBdev4", 00:23:38.175 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:38.175 "is_configured": true, 00:23:38.175 "data_offset": 2048, 00:23:38.175 "data_size": 63488 00:23:38.175 } 00:23:38.175 ] 00:23:38.175 }' 00:23:38.175 12:04:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:38.175 12:04:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:23:38.742 12:04:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:23:38.742 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:39.000 [2024-07-15 12:04:52.484657] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:39.001 "name": "Existed_Raid", 00:23:39.001 "aliases": [ 00:23:39.001 "c7138823-92d3-463a-9c0c-ac32d19a820e" 00:23:39.001 ], 00:23:39.001 "product_name": "Raid Volume", 00:23:39.001 "block_size": 512, 00:23:39.001 "num_blocks": 63488, 00:23:39.001 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:39.001 "assigned_rate_limits": { 00:23:39.001 "rw_ios_per_sec": 0, 00:23:39.001 "rw_mbytes_per_sec": 0, 00:23:39.001 "r_mbytes_per_sec": 0, 00:23:39.001 "w_mbytes_per_sec": 0 00:23:39.001 }, 00:23:39.001 "claimed": false, 00:23:39.001 "zoned": false, 00:23:39.001 "supported_io_types": { 00:23:39.001 "read": true, 00:23:39.001 "write": true, 00:23:39.001 "unmap": false, 00:23:39.001 "flush": false, 00:23:39.001 "reset": true, 00:23:39.001 "nvme_admin": false, 00:23:39.001 "nvme_io": false, 00:23:39.001 "nvme_io_md": false, 00:23:39.001 "write_zeroes": true, 00:23:39.001 "zcopy": false, 00:23:39.001 
"get_zone_info": false, 00:23:39.001 "zone_management": false, 00:23:39.001 "zone_append": false, 00:23:39.001 "compare": false, 00:23:39.001 "compare_and_write": false, 00:23:39.001 "abort": false, 00:23:39.001 "seek_hole": false, 00:23:39.001 "seek_data": false, 00:23:39.001 "copy": false, 00:23:39.001 "nvme_iov_md": false 00:23:39.001 }, 00:23:39.001 "memory_domains": [ 00:23:39.001 { 00:23:39.001 "dma_device_id": "system", 00:23:39.001 "dma_device_type": 1 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.001 "dma_device_type": 2 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "system", 00:23:39.001 "dma_device_type": 1 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.001 "dma_device_type": 2 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "system", 00:23:39.001 "dma_device_type": 1 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.001 "dma_device_type": 2 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "system", 00:23:39.001 "dma_device_type": 1 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.001 "dma_device_type": 2 00:23:39.001 } 00:23:39.001 ], 00:23:39.001 "driver_specific": { 00:23:39.001 "raid": { 00:23:39.001 "uuid": "c7138823-92d3-463a-9c0c-ac32d19a820e", 00:23:39.001 "strip_size_kb": 0, 00:23:39.001 "state": "online", 00:23:39.001 "raid_level": "raid1", 00:23:39.001 "superblock": true, 00:23:39.001 "num_base_bdevs": 4, 00:23:39.001 "num_base_bdevs_discovered": 4, 00:23:39.001 "num_base_bdevs_operational": 4, 00:23:39.001 "base_bdevs_list": [ 00:23:39.001 { 00:23:39.001 "name": "NewBaseBdev", 00:23:39.001 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:39.001 "is_configured": true, 00:23:39.001 "data_offset": 2048, 00:23:39.001 "data_size": 63488 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "name": "BaseBdev2", 00:23:39.001 
"uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:39.001 "is_configured": true, 00:23:39.001 "data_offset": 2048, 00:23:39.001 "data_size": 63488 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "name": "BaseBdev3", 00:23:39.001 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:39.001 "is_configured": true, 00:23:39.001 "data_offset": 2048, 00:23:39.001 "data_size": 63488 00:23:39.001 }, 00:23:39.001 { 00:23:39.001 "name": "BaseBdev4", 00:23:39.001 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:39.001 "is_configured": true, 00:23:39.001 "data_offset": 2048, 00:23:39.001 "data_size": 63488 00:23:39.001 } 00:23:39.001 ] 00:23:39.001 } 00:23:39.001 } 00:23:39.001 }' 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:23:39.001 BaseBdev2 00:23:39.001 BaseBdev3 00:23:39.001 BaseBdev4' 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:23:39.001 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:39.258 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:39.258 "name": "NewBaseBdev", 00:23:39.258 "aliases": [ 00:23:39.258 "2ec04dc7-8a38-4252-a34c-d7b21da1db2a" 00:23:39.258 ], 00:23:39.258 "product_name": "Malloc disk", 00:23:39.258 "block_size": 512, 00:23:39.258 "num_blocks": 65536, 00:23:39.258 "uuid": "2ec04dc7-8a38-4252-a34c-d7b21da1db2a", 00:23:39.258 "assigned_rate_limits": { 00:23:39.258 "rw_ios_per_sec": 0, 00:23:39.258 "rw_mbytes_per_sec": 0, 
00:23:39.258 "r_mbytes_per_sec": 0, 00:23:39.258 "w_mbytes_per_sec": 0 00:23:39.258 }, 00:23:39.258 "claimed": true, 00:23:39.258 "claim_type": "exclusive_write", 00:23:39.258 "zoned": false, 00:23:39.258 "supported_io_types": { 00:23:39.258 "read": true, 00:23:39.258 "write": true, 00:23:39.258 "unmap": true, 00:23:39.258 "flush": true, 00:23:39.258 "reset": true, 00:23:39.258 "nvme_admin": false, 00:23:39.258 "nvme_io": false, 00:23:39.258 "nvme_io_md": false, 00:23:39.258 "write_zeroes": true, 00:23:39.258 "zcopy": true, 00:23:39.258 "get_zone_info": false, 00:23:39.258 "zone_management": false, 00:23:39.258 "zone_append": false, 00:23:39.258 "compare": false, 00:23:39.258 "compare_and_write": false, 00:23:39.258 "abort": true, 00:23:39.258 "seek_hole": false, 00:23:39.258 "seek_data": false, 00:23:39.258 "copy": true, 00:23:39.258 "nvme_iov_md": false 00:23:39.258 }, 00:23:39.258 "memory_domains": [ 00:23:39.258 { 00:23:39.258 "dma_device_id": "system", 00:23:39.258 "dma_device_type": 1 00:23:39.258 }, 00:23:39.258 { 00:23:39.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:39.258 "dma_device_type": 2 00:23:39.258 } 00:23:39.258 ], 00:23:39.258 "driver_specific": {} 00:23:39.258 }' 00:23:39.258 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:39.516 12:04:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:39.516 12:04:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:39.516 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:39.516 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:39.773 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:39.773 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:39.773 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:39.773 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:23:39.773 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:40.031 "name": "BaseBdev2", 00:23:40.031 "aliases": [ 00:23:40.031 "8dd449ee-27d2-400c-8ff1-5296ee9605c9" 00:23:40.031 ], 00:23:40.031 "product_name": "Malloc disk", 00:23:40.031 "block_size": 512, 00:23:40.031 "num_blocks": 65536, 00:23:40.031 "uuid": "8dd449ee-27d2-400c-8ff1-5296ee9605c9", 00:23:40.031 "assigned_rate_limits": { 00:23:40.031 "rw_ios_per_sec": 0, 00:23:40.031 "rw_mbytes_per_sec": 0, 00:23:40.031 "r_mbytes_per_sec": 0, 00:23:40.031 "w_mbytes_per_sec": 0 00:23:40.031 }, 00:23:40.031 "claimed": true, 00:23:40.031 "claim_type": "exclusive_write", 00:23:40.031 "zoned": false, 00:23:40.031 "supported_io_types": { 00:23:40.031 "read": true, 00:23:40.031 "write": true, 00:23:40.031 "unmap": true, 00:23:40.031 "flush": true, 00:23:40.031 "reset": true, 00:23:40.031 "nvme_admin": false, 00:23:40.031 "nvme_io": false, 00:23:40.031 "nvme_io_md": false, 00:23:40.031 "write_zeroes": true, 00:23:40.031 "zcopy": true, 00:23:40.031 
"get_zone_info": false, 00:23:40.031 "zone_management": false, 00:23:40.031 "zone_append": false, 00:23:40.031 "compare": false, 00:23:40.031 "compare_and_write": false, 00:23:40.031 "abort": true, 00:23:40.031 "seek_hole": false, 00:23:40.031 "seek_data": false, 00:23:40.031 "copy": true, 00:23:40.031 "nvme_iov_md": false 00:23:40.031 }, 00:23:40.031 "memory_domains": [ 00:23:40.031 { 00:23:40.031 "dma_device_id": "system", 00:23:40.031 "dma_device_type": 1 00:23:40.031 }, 00:23:40.031 { 00:23:40.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:40.031 "dma_device_type": 2 00:23:40.031 } 00:23:40.031 ], 00:23:40.031 "driver_specific": {} 00:23:40.031 }' 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:40.031 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:23:40.288 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:40.545 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:40.545 "name": "BaseBdev3", 00:23:40.545 "aliases": [ 00:23:40.545 "fa3d6c09-e3da-4f43-9143-bb81c02d53e6" 00:23:40.545 ], 00:23:40.545 "product_name": "Malloc disk", 00:23:40.545 "block_size": 512, 00:23:40.545 "num_blocks": 65536, 00:23:40.545 "uuid": "fa3d6c09-e3da-4f43-9143-bb81c02d53e6", 00:23:40.545 "assigned_rate_limits": { 00:23:40.545 "rw_ios_per_sec": 0, 00:23:40.545 "rw_mbytes_per_sec": 0, 00:23:40.545 "r_mbytes_per_sec": 0, 00:23:40.545 "w_mbytes_per_sec": 0 00:23:40.545 }, 00:23:40.545 "claimed": true, 00:23:40.545 "claim_type": "exclusive_write", 00:23:40.545 "zoned": false, 00:23:40.545 "supported_io_types": { 00:23:40.545 "read": true, 00:23:40.545 "write": true, 00:23:40.545 "unmap": true, 00:23:40.545 "flush": true, 00:23:40.545 "reset": true, 00:23:40.545 "nvme_admin": false, 00:23:40.545 "nvme_io": false, 00:23:40.545 "nvme_io_md": false, 00:23:40.545 "write_zeroes": true, 00:23:40.545 "zcopy": true, 00:23:40.545 "get_zone_info": false, 00:23:40.545 "zone_management": false, 00:23:40.545 "zone_append": false, 00:23:40.545 "compare": false, 00:23:40.545 "compare_and_write": false, 00:23:40.545 "abort": true, 00:23:40.545 "seek_hole": false, 00:23:40.545 "seek_data": false, 00:23:40.545 "copy": true, 00:23:40.545 "nvme_iov_md": false 00:23:40.545 }, 00:23:40.545 "memory_domains": [ 00:23:40.545 { 00:23:40.545 "dma_device_id": "system", 00:23:40.545 "dma_device_type": 1 00:23:40.545 }, 00:23:40.545 { 00:23:40.545 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:40.545 
"dma_device_type": 2 00:23:40.545 } 00:23:40.545 ], 00:23:40.545 "driver_specific": {} 00:23:40.545 }' 00:23:40.545 12:04:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:40.545 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:40.545 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:40.545 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:40.545 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:23:40.803 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:41.062 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:41.062 "name": "BaseBdev4", 00:23:41.062 "aliases": [ 00:23:41.062 
"796d2778-4481-48d8-b032-4ebc69e10908" 00:23:41.062 ], 00:23:41.062 "product_name": "Malloc disk", 00:23:41.062 "block_size": 512, 00:23:41.062 "num_blocks": 65536, 00:23:41.062 "uuid": "796d2778-4481-48d8-b032-4ebc69e10908", 00:23:41.062 "assigned_rate_limits": { 00:23:41.062 "rw_ios_per_sec": 0, 00:23:41.062 "rw_mbytes_per_sec": 0, 00:23:41.062 "r_mbytes_per_sec": 0, 00:23:41.062 "w_mbytes_per_sec": 0 00:23:41.062 }, 00:23:41.062 "claimed": true, 00:23:41.062 "claim_type": "exclusive_write", 00:23:41.062 "zoned": false, 00:23:41.062 "supported_io_types": { 00:23:41.062 "read": true, 00:23:41.062 "write": true, 00:23:41.062 "unmap": true, 00:23:41.062 "flush": true, 00:23:41.062 "reset": true, 00:23:41.062 "nvme_admin": false, 00:23:41.062 "nvme_io": false, 00:23:41.062 "nvme_io_md": false, 00:23:41.062 "write_zeroes": true, 00:23:41.062 "zcopy": true, 00:23:41.062 "get_zone_info": false, 00:23:41.062 "zone_management": false, 00:23:41.062 "zone_append": false, 00:23:41.062 "compare": false, 00:23:41.062 "compare_and_write": false, 00:23:41.062 "abort": true, 00:23:41.062 "seek_hole": false, 00:23:41.062 "seek_data": false, 00:23:41.062 "copy": true, 00:23:41.062 "nvme_iov_md": false 00:23:41.062 }, 00:23:41.062 "memory_domains": [ 00:23:41.062 { 00:23:41.062 "dma_device_id": "system", 00:23:41.062 "dma_device_type": 1 00:23:41.062 }, 00:23:41.062 { 00:23:41.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:41.062 "dma_device_type": 2 00:23:41.062 } 00:23:41.062 ], 00:23:41.062 "driver_specific": {} 00:23:41.062 }' 00:23:41.062 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:41.062 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:41.321 12:04:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:41.321 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:41.580 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:41.580 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:41.580 12:04:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:23:41.840 [2024-07-15 12:04:55.187605] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:23:41.840 [2024-07-15 12:04:55.187638] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:41.840 [2024-07-15 12:04:55.187712] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:41.840 [2024-07-15 12:04:55.188019] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:41.840 [2024-07-15 12:04:55.188033] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c3e370 name Existed_Raid, state offline 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 1552311 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1552311 ']' 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@952 -- # kill -0 1552311 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # uname 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1552311 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1552311' 00:23:41.840 killing process with pid 1552311 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # kill 1552311 00:23:41.840 [2024-07-15 12:04:55.256423] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:41.840 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@972 -- # wait 1552311 00:23:41.840 [2024-07-15 12:04:55.344722] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:42.491 12:04:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:23:42.491 00:23:42.491 real 0m35.233s 00:23:42.491 user 1m4.555s 00:23:42.491 sys 0m6.209s 00:23:42.491 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:23:42.491 12:04:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:42.491 ************************************ 00:23:42.491 END TEST raid_state_function_test_sb 00:23:42.491 ************************************ 00:23:42.491 12:04:55 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:23:42.491 12:04:55 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test 
raid_superblock_test raid_superblock_test raid1 4 00:23:42.491 12:04:55 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:23:42.491 12:04:55 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:23:42.491 12:04:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:42.492 ************************************ 00:23:42.492 START TEST raid_superblock_test 00:23:42.492 ************************************ 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 4 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=1557368 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 1557368 /var/tmp/spdk-raid.sock 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@829 -- # '[' -z 1557368 ']' 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:42.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:23:42.492 12:04:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:42.492 [2024-07-15 12:04:55.882084] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:23:42.492 [2024-07-15 12:04:55.882154] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1557368 ] 00:23:42.492 [2024-07-15 12:04:56.011610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:42.751 [2024-07-15 12:04:56.117139] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:42.751 [2024-07-15 12:04:56.172234] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:42.751 [2024-07-15 12:04:56.172263] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@862 -- # return 0 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:43.689 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:23:43.948 malloc1 00:23:43.948 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:44.207 [2024-07-15 12:04:57.565147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:44.208 [2024-07-15 12:04:57.565199] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:44.208 [2024-07-15 12:04:57.565220] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe32560 00:23:44.208 [2024-07-15 12:04:57.565233] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:44.208 [2024-07-15 12:04:57.566832] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:44.208 [2024-07-15 12:04:57.566859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:44.208 pt1 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:44.208 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:44.208 12:04:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:23:44.467 malloc2 00:23:44.467 12:04:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:45.035 [2024-07-15 12:04:58.339804] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:45.035 [2024-07-15 12:04:58.339853] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.035 [2024-07-15 12:04:58.339871] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed05b0 00:23:45.035 [2024-07-15 12:04:58.339884] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.035 [2024-07-15 12:04:58.341470] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.035 [2024-07-15 12:04:58.341498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:45.035 pt2 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:45.035 12:04:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:45.035 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:23:45.035 malloc3 00:23:45.036 12:04:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:45.604 [2024-07-15 12:04:59.111652] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:45.604 [2024-07-15 12:04:59.111710] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:45.604 [2024-07-15 12:04:59.111729] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed0be0 00:23:45.604 [2024-07-15 12:04:59.111742] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:45.604 [2024-07-15 12:04:59.113310] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:45.604 [2024-07-15 12:04:59.113337] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:45.604 pt3 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:23:45.604 
12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:23:45.604 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:23:45.863 malloc4 00:23:45.863 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:23:46.431 [2024-07-15 12:04:59.882236] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:46.431 [2024-07-15 12:04:59.882285] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.431 [2024-07-15 12:04:59.882302] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed3f00 00:23:46.431 [2024-07-15 12:04:59.882315] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.431 [2024-07-15 12:04:59.883869] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.431 [2024-07-15 12:04:59.883897] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:46.431 pt4 00:23:46.432 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:23:46.432 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:23:46.432 12:04:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:23:46.999 [2024-07-15 12:05:00.395601] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:23:46.999 [2024-07-15 12:05:00.396995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:46.999 [2024-07-15 12:05:00.397058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:46.999 [2024-07-15 12:05:00.397101] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:46.999 [2024-07-15 12:05:00.397276] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xed3880 00:23:46.999 [2024-07-15 12:05:00.397287] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:46.999 [2024-07-15 12:05:00.397494] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xed3850 00:23:46.999 [2024-07-15 12:05:00.397650] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed3880 00:23:46.999 [2024-07-15 12:05:00.397660] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed3880 00:23:46.999 [2024-07-15 12:05:00.397778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.999 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.258 12:05:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.258 "name": "raid_bdev1", 00:23:47.258 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:47.258 "strip_size_kb": 0, 00:23:47.258 "state": "online", 00:23:47.258 "raid_level": "raid1", 00:23:47.258 "superblock": true, 00:23:47.258 "num_base_bdevs": 4, 00:23:47.258 "num_base_bdevs_discovered": 4, 00:23:47.258 "num_base_bdevs_operational": 4, 00:23:47.258 "base_bdevs_list": [ 00:23:47.258 { 00:23:47.258 "name": "pt1", 00:23:47.258 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:47.258 "is_configured": true, 00:23:47.258 "data_offset": 2048, 00:23:47.258 "data_size": 63488 00:23:47.258 }, 00:23:47.258 { 00:23:47.258 "name": "pt2", 00:23:47.258 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:47.258 "is_configured": true, 00:23:47.258 "data_offset": 2048, 00:23:47.258 "data_size": 63488 00:23:47.258 }, 00:23:47.258 { 00:23:47.258 "name": "pt3", 00:23:47.258 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:47.258 "is_configured": true, 00:23:47.258 "data_offset": 2048, 00:23:47.258 "data_size": 63488 00:23:47.258 }, 00:23:47.258 { 00:23:47.258 "name": "pt4", 00:23:47.258 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:47.258 "is_configured": true, 00:23:47.258 "data_offset": 2048, 00:23:47.258 "data_size": 63488 00:23:47.258 } 00:23:47.258 ] 00:23:47.258 }' 00:23:47.258 12:05:00 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.258 12:05:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:47.834 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:48.093 [2024-07-15 12:05:01.519208] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:48.093 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:48.093 "name": "raid_bdev1", 00:23:48.093 "aliases": [ 00:23:48.093 "71598cfa-53fa-4833-b24b-05bbcb76748b" 00:23:48.093 ], 00:23:48.093 "product_name": "Raid Volume", 00:23:48.093 "block_size": 512, 00:23:48.093 "num_blocks": 63488, 00:23:48.093 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:48.093 "assigned_rate_limits": { 00:23:48.093 "rw_ios_per_sec": 0, 00:23:48.093 "rw_mbytes_per_sec": 0, 00:23:48.093 "r_mbytes_per_sec": 0, 00:23:48.093 "w_mbytes_per_sec": 0 00:23:48.093 }, 00:23:48.093 "claimed": false, 00:23:48.093 "zoned": false, 00:23:48.093 "supported_io_types": { 00:23:48.093 "read": true, 00:23:48.093 "write": true, 00:23:48.093 
"unmap": false, 00:23:48.093 "flush": false, 00:23:48.093 "reset": true, 00:23:48.093 "nvme_admin": false, 00:23:48.093 "nvme_io": false, 00:23:48.093 "nvme_io_md": false, 00:23:48.093 "write_zeroes": true, 00:23:48.093 "zcopy": false, 00:23:48.093 "get_zone_info": false, 00:23:48.093 "zone_management": false, 00:23:48.093 "zone_append": false, 00:23:48.093 "compare": false, 00:23:48.093 "compare_and_write": false, 00:23:48.093 "abort": false, 00:23:48.093 "seek_hole": false, 00:23:48.093 "seek_data": false, 00:23:48.093 "copy": false, 00:23:48.093 "nvme_iov_md": false 00:23:48.093 }, 00:23:48.093 "memory_domains": [ 00:23:48.093 { 00:23:48.093 "dma_device_id": "system", 00:23:48.093 "dma_device_type": 1 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.093 "dma_device_type": 2 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "system", 00:23:48.093 "dma_device_type": 1 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.093 "dma_device_type": 2 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "system", 00:23:48.093 "dma_device_type": 1 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.093 "dma_device_type": 2 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "system", 00:23:48.093 "dma_device_type": 1 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.093 "dma_device_type": 2 00:23:48.093 } 00:23:48.093 ], 00:23:48.093 "driver_specific": { 00:23:48.093 "raid": { 00:23:48.093 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:48.093 "strip_size_kb": 0, 00:23:48.093 "state": "online", 00:23:48.093 "raid_level": "raid1", 00:23:48.093 "superblock": true, 00:23:48.093 "num_base_bdevs": 4, 00:23:48.093 "num_base_bdevs_discovered": 4, 00:23:48.093 "num_base_bdevs_operational": 4, 00:23:48.093 "base_bdevs_list": [ 00:23:48.093 { 00:23:48.093 "name": "pt1", 
00:23:48.093 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:48.093 "is_configured": true, 00:23:48.093 "data_offset": 2048, 00:23:48.093 "data_size": 63488 00:23:48.093 }, 00:23:48.093 { 00:23:48.093 "name": "pt2", 00:23:48.094 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:48.094 "is_configured": true, 00:23:48.094 "data_offset": 2048, 00:23:48.094 "data_size": 63488 00:23:48.094 }, 00:23:48.094 { 00:23:48.094 "name": "pt3", 00:23:48.094 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:48.094 "is_configured": true, 00:23:48.094 "data_offset": 2048, 00:23:48.094 "data_size": 63488 00:23:48.094 }, 00:23:48.094 { 00:23:48.094 "name": "pt4", 00:23:48.094 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:48.094 "is_configured": true, 00:23:48.094 "data_offset": 2048, 00:23:48.094 "data_size": 63488 00:23:48.094 } 00:23:48.094 ] 00:23:48.094 } 00:23:48.094 } 00:23:48.094 }' 00:23:48.094 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:48.094 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:48.094 pt2 00:23:48.094 pt3 00:23:48.094 pt4' 00:23:48.094 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:48.094 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:48.094 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:48.353 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:48.353 "name": "pt1", 00:23:48.353 "aliases": [ 00:23:48.353 "00000000-0000-0000-0000-000000000001" 00:23:48.353 ], 00:23:48.353 "product_name": "passthru", 00:23:48.353 "block_size": 512, 00:23:48.353 "num_blocks": 65536, 00:23:48.353 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:23:48.353 "assigned_rate_limits": { 00:23:48.353 "rw_ios_per_sec": 0, 00:23:48.353 "rw_mbytes_per_sec": 0, 00:23:48.353 "r_mbytes_per_sec": 0, 00:23:48.353 "w_mbytes_per_sec": 0 00:23:48.353 }, 00:23:48.353 "claimed": true, 00:23:48.353 "claim_type": "exclusive_write", 00:23:48.353 "zoned": false, 00:23:48.353 "supported_io_types": { 00:23:48.353 "read": true, 00:23:48.353 "write": true, 00:23:48.353 "unmap": true, 00:23:48.353 "flush": true, 00:23:48.353 "reset": true, 00:23:48.353 "nvme_admin": false, 00:23:48.353 "nvme_io": false, 00:23:48.353 "nvme_io_md": false, 00:23:48.353 "write_zeroes": true, 00:23:48.353 "zcopy": true, 00:23:48.353 "get_zone_info": false, 00:23:48.353 "zone_management": false, 00:23:48.353 "zone_append": false, 00:23:48.353 "compare": false, 00:23:48.353 "compare_and_write": false, 00:23:48.353 "abort": true, 00:23:48.353 "seek_hole": false, 00:23:48.353 "seek_data": false, 00:23:48.353 "copy": true, 00:23:48.353 "nvme_iov_md": false 00:23:48.353 }, 00:23:48.353 "memory_domains": [ 00:23:48.353 { 00:23:48.353 "dma_device_id": "system", 00:23:48.353 "dma_device_type": 1 00:23:48.353 }, 00:23:48.353 { 00:23:48.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.353 "dma_device_type": 2 00:23:48.353 } 00:23:48.353 ], 00:23:48.353 "driver_specific": { 00:23:48.353 "passthru": { 00:23:48.353 "name": "pt1", 00:23:48.353 "base_bdev_name": "malloc1" 00:23:48.353 } 00:23:48.353 } 00:23:48.353 }' 00:23:48.353 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:48.353 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:48.353 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:48.353 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:48.612 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:48.612 12:05:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:48.612 12:05:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:48.612 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:48.871 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:48.871 "name": "pt2", 00:23:48.871 "aliases": [ 00:23:48.871 "00000000-0000-0000-0000-000000000002" 00:23:48.871 ], 00:23:48.871 "product_name": "passthru", 00:23:48.871 "block_size": 512, 00:23:48.871 "num_blocks": 65536, 00:23:48.871 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:48.871 "assigned_rate_limits": { 00:23:48.871 "rw_ios_per_sec": 0, 00:23:48.871 "rw_mbytes_per_sec": 0, 00:23:48.871 "r_mbytes_per_sec": 0, 00:23:48.871 "w_mbytes_per_sec": 0 00:23:48.871 }, 00:23:48.871 "claimed": true, 00:23:48.871 "claim_type": "exclusive_write", 00:23:48.871 "zoned": false, 00:23:48.871 "supported_io_types": { 00:23:48.871 "read": true, 00:23:48.871 "write": true, 00:23:48.871 "unmap": true, 00:23:48.871 "flush": true, 00:23:48.871 "reset": true, 00:23:48.871 "nvme_admin": false, 00:23:48.871 
"nvme_io": false, 00:23:48.871 "nvme_io_md": false, 00:23:48.871 "write_zeroes": true, 00:23:48.871 "zcopy": true, 00:23:48.871 "get_zone_info": false, 00:23:48.871 "zone_management": false, 00:23:48.871 "zone_append": false, 00:23:48.871 "compare": false, 00:23:48.871 "compare_and_write": false, 00:23:48.871 "abort": true, 00:23:48.871 "seek_hole": false, 00:23:48.871 "seek_data": false, 00:23:48.871 "copy": true, 00:23:48.871 "nvme_iov_md": false 00:23:48.871 }, 00:23:48.871 "memory_domains": [ 00:23:48.871 { 00:23:48.871 "dma_device_id": "system", 00:23:48.871 "dma_device_type": 1 00:23:48.871 }, 00:23:48.871 { 00:23:48.871 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:48.871 "dma_device_type": 2 00:23:48.871 } 00:23:48.871 ], 00:23:48.871 "driver_specific": { 00:23:48.871 "passthru": { 00:23:48.871 "name": "pt2", 00:23:48.871 "base_bdev_name": "malloc2" 00:23:48.871 } 00:23:48.871 } 00:23:48.871 }' 00:23:48.871 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:49.130 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:49.389 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:23:49.389 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:49.389 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:49.389 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:49.389 12:05:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:49.648 "name": "pt3", 00:23:49.648 "aliases": [ 00:23:49.648 "00000000-0000-0000-0000-000000000003" 00:23:49.648 ], 00:23:49.648 "product_name": "passthru", 00:23:49.648 "block_size": 512, 00:23:49.648 "num_blocks": 65536, 00:23:49.648 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:49.648 "assigned_rate_limits": { 00:23:49.648 "rw_ios_per_sec": 0, 00:23:49.648 "rw_mbytes_per_sec": 0, 00:23:49.648 "r_mbytes_per_sec": 0, 00:23:49.648 "w_mbytes_per_sec": 0 00:23:49.648 }, 00:23:49.648 "claimed": true, 00:23:49.648 "claim_type": "exclusive_write", 00:23:49.648 "zoned": false, 00:23:49.648 "supported_io_types": { 00:23:49.648 "read": true, 00:23:49.648 "write": true, 00:23:49.648 "unmap": true, 00:23:49.648 "flush": true, 00:23:49.648 "reset": true, 00:23:49.648 "nvme_admin": false, 00:23:49.648 "nvme_io": false, 00:23:49.648 "nvme_io_md": false, 00:23:49.648 "write_zeroes": true, 00:23:49.648 "zcopy": true, 00:23:49.648 "get_zone_info": false, 00:23:49.648 "zone_management": false, 00:23:49.648 "zone_append": false, 00:23:49.648 "compare": false, 00:23:49.648 "compare_and_write": false, 00:23:49.648 "abort": true, 00:23:49.648 "seek_hole": false, 00:23:49.648 "seek_data": false, 00:23:49.648 "copy": true, 00:23:49.648 "nvme_iov_md": false 00:23:49.648 }, 00:23:49.648 "memory_domains": [ 00:23:49.648 { 00:23:49.648 "dma_device_id": "system", 00:23:49.648 
"dma_device_type": 1 00:23:49.648 }, 00:23:49.648 { 00:23:49.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:49.648 "dma_device_type": 2 00:23:49.648 } 00:23:49.648 ], 00:23:49.648 "driver_specific": { 00:23:49.648 "passthru": { 00:23:49.648 "name": "pt3", 00:23:49.648 "base_bdev_name": "malloc3" 00:23:49.648 } 00:23:49.648 } 00:23:49.648 }' 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:49.648 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:49.907 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:50.166 12:05:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:50.166 "name": "pt4", 00:23:50.166 "aliases": [ 00:23:50.166 "00000000-0000-0000-0000-000000000004" 00:23:50.166 ], 00:23:50.166 "product_name": "passthru", 00:23:50.166 "block_size": 512, 00:23:50.166 "num_blocks": 65536, 00:23:50.166 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:50.166 "assigned_rate_limits": { 00:23:50.166 "rw_ios_per_sec": 0, 00:23:50.166 "rw_mbytes_per_sec": 0, 00:23:50.166 "r_mbytes_per_sec": 0, 00:23:50.166 "w_mbytes_per_sec": 0 00:23:50.166 }, 00:23:50.166 "claimed": true, 00:23:50.166 "claim_type": "exclusive_write", 00:23:50.166 "zoned": false, 00:23:50.166 "supported_io_types": { 00:23:50.166 "read": true, 00:23:50.166 "write": true, 00:23:50.166 "unmap": true, 00:23:50.166 "flush": true, 00:23:50.166 "reset": true, 00:23:50.166 "nvme_admin": false, 00:23:50.166 "nvme_io": false, 00:23:50.166 "nvme_io_md": false, 00:23:50.166 "write_zeroes": true, 00:23:50.166 "zcopy": true, 00:23:50.166 "get_zone_info": false, 00:23:50.166 "zone_management": false, 00:23:50.166 "zone_append": false, 00:23:50.166 "compare": false, 00:23:50.166 "compare_and_write": false, 00:23:50.166 "abort": true, 00:23:50.166 "seek_hole": false, 00:23:50.166 "seek_data": false, 00:23:50.166 "copy": true, 00:23:50.166 "nvme_iov_md": false 00:23:50.166 }, 00:23:50.166 "memory_domains": [ 00:23:50.166 { 00:23:50.166 "dma_device_id": "system", 00:23:50.166 "dma_device_type": 1 00:23:50.166 }, 00:23:50.166 { 00:23:50.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:50.166 "dma_device_type": 2 00:23:50.166 } 00:23:50.166 ], 00:23:50.166 "driver_specific": { 00:23:50.166 "passthru": { 00:23:50.166 "name": "pt4", 00:23:50.166 "base_bdev_name": "malloc4" 00:23:50.166 } 00:23:50.166 } 00:23:50.166 }' 00:23:50.166 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:50.166 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:50.166 12:05:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:50.166 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:50.425 12:05:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:50.425 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:23:50.425 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:50.684 [2024-07-15 12:05:04.230512] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:50.684 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=71598cfa-53fa-4833-b24b-05bbcb76748b 00:23:50.684 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 71598cfa-53fa-4833-b24b-05bbcb76748b ']' 00:23:50.684 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:50.942 [2024-07-15 12:05:04.482946] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:50.942 
[2024-07-15 12:05:04.482971] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:50.942 [2024-07-15 12:05:04.483024] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:50.942 [2024-07-15 12:05:04.483109] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:50.942 [2024-07-15 12:05:04.483122] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed3880 name raid_bdev1, state offline 00:23:50.942 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.942 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:23:51.201 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:23:51.201 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:23:51.201 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:51.201 12:05:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:51.460 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:51.461 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:51.719 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:51.719 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:23:51.979 12:05:05 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:23:51.979 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:23:52.238 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:23:52.238 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@648 -- # local es=0 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:52.496 12:05:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:23:52.756 [2024-07-15 12:05:06.215431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:23:52.756 [2024-07-15 12:05:06.216777] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:23:52.756 [2024-07-15 12:05:06.216820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:23:52.756 [2024-07-15 12:05:06.216854] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:23:52.756 [2024-07-15 12:05:06.216896] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:23:52.756 [2024-07-15 12:05:06.216934] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:23:52.756 [2024-07-15 12:05:06.216957] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:23:52.756 [2024-07-15 12:05:06.216979] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:23:52.756 [2024-07-15 
12:05:06.217003] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:52.756 [2024-07-15 12:05:06.217014] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed4bf0 name raid_bdev1, state configuring 00:23:52.756 request: 00:23:52.756 { 00:23:52.756 "name": "raid_bdev1", 00:23:52.756 "raid_level": "raid1", 00:23:52.756 "base_bdevs": [ 00:23:52.756 "malloc1", 00:23:52.756 "malloc2", 00:23:52.756 "malloc3", 00:23:52.756 "malloc4" 00:23:52.756 ], 00:23:52.756 "superblock": false, 00:23:52.756 "method": "bdev_raid_create", 00:23:52.756 "req_id": 1 00:23:52.756 } 00:23:52.756 Got JSON-RPC error response 00:23:52.756 response: 00:23:52.756 { 00:23:52.756 "code": -17, 00:23:52.756 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:23:52.756 } 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # es=1 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.756 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:23:53.015 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:23:53.015 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:23:53.016 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:23:53.274 [2024-07-15 12:05:06.716712] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:23:53.274 [2024-07-15 12:05:06.716761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.274 [2024-07-15 12:05:06.716780] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe35110 00:23:53.274 [2024-07-15 12:05:06.716792] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.274 [2024-07-15 12:05:06.718386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.274 [2024-07-15 12:05:06.718414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:23:53.274 [2024-07-15 12:05:06.718486] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:23:53.274 [2024-07-15 12:05:06.718513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:23:53.274 pt1 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:53.274 12:05:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:53.274 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:53.534 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:53.534 "name": "raid_bdev1", 00:23:53.534 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:53.534 "strip_size_kb": 0, 00:23:53.534 "state": "configuring", 00:23:53.534 "raid_level": "raid1", 00:23:53.534 "superblock": true, 00:23:53.534 "num_base_bdevs": 4, 00:23:53.534 "num_base_bdevs_discovered": 1, 00:23:53.534 "num_base_bdevs_operational": 4, 00:23:53.534 "base_bdevs_list": [ 00:23:53.534 { 00:23:53.534 "name": "pt1", 00:23:53.534 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:53.534 "is_configured": true, 00:23:53.534 "data_offset": 2048, 00:23:53.534 "data_size": 63488 00:23:53.534 }, 00:23:53.534 { 00:23:53.534 "name": null, 00:23:53.534 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:53.534 "is_configured": false, 00:23:53.534 "data_offset": 2048, 00:23:53.534 "data_size": 63488 00:23:53.534 }, 00:23:53.534 { 00:23:53.534 "name": null, 00:23:53.534 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:53.534 "is_configured": false, 00:23:53.534 "data_offset": 2048, 00:23:53.534 "data_size": 63488 00:23:53.534 }, 00:23:53.534 { 00:23:53.534 "name": null, 00:23:53.534 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:53.534 "is_configured": false, 00:23:53.534 "data_offset": 2048, 00:23:53.534 "data_size": 63488 00:23:53.534 } 00:23:53.534 ] 00:23:53.534 }' 00:23:53.534 12:05:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:53.534 12:05:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:23:54.104 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:23:54.104 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:54.364 [2024-07-15 12:05:07.751450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:54.364 [2024-07-15 12:05:07.751503] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.364 [2024-07-15 12:05:07.751524] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe29660 00:23:54.364 [2024-07-15 12:05:07.751537] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.364 [2024-07-15 12:05:07.751877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.364 [2024-07-15 12:05:07.751894] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:54.364 [2024-07-15 12:05:07.751954] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:23:54.364 [2024-07-15 12:05:07.751972] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:54.364 pt2 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:23:54.364 [2024-07-15 12:05:07.931950] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.364 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.624 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.624 12:05:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.624 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.624 "name": "raid_bdev1", 00:23:54.624 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:54.624 "strip_size_kb": 0, 00:23:54.624 "state": "configuring", 00:23:54.624 "raid_level": "raid1", 00:23:54.624 "superblock": true, 00:23:54.624 "num_base_bdevs": 4, 00:23:54.624 "num_base_bdevs_discovered": 1, 00:23:54.624 "num_base_bdevs_operational": 4, 00:23:54.624 "base_bdevs_list": [ 00:23:54.624 { 00:23:54.624 "name": "pt1", 00:23:54.624 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:54.624 "is_configured": true, 00:23:54.624 "data_offset": 2048, 00:23:54.624 "data_size": 63488 00:23:54.624 }, 00:23:54.624 { 00:23:54.624 "name": null, 00:23:54.624 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:54.624 "is_configured": false, 00:23:54.624 "data_offset": 2048, 00:23:54.624 
"data_size": 63488 00:23:54.624 }, 00:23:54.624 { 00:23:54.624 "name": null, 00:23:54.624 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:54.624 "is_configured": false, 00:23:54.624 "data_offset": 2048, 00:23:54.624 "data_size": 63488 00:23:54.624 }, 00:23:54.624 { 00:23:54.624 "name": null, 00:23:54.624 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:54.624 "is_configured": false, 00:23:54.624 "data_offset": 2048, 00:23:54.624 "data_size": 63488 00:23:54.624 } 00:23:54.624 ] 00:23:54.624 }' 00:23:54.624 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.624 12:05:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:55.192 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:23:55.192 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.192 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:23:55.451 [2024-07-15 12:05:08.802251] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:23:55.451 [2024-07-15 12:05:08.802301] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.451 [2024-07-15 12:05:08.802318] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe29890 00:23:55.451 [2024-07-15 12:05:08.802331] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.451 [2024-07-15 12:05:08.802656] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.451 [2024-07-15 12:05:08.802673] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:23:55.451 [2024-07-15 12:05:08.802736] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:23:55.451 [2024-07-15 12:05:08.802754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:23:55.451 pt2 00:23:55.451 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:55.451 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.452 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:23:55.452 [2024-07-15 12:05:08.970694] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:23:55.452 [2024-07-15 12:05:08.970729] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.452 [2024-07-15 12:05:08.970744] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed5780 00:23:55.452 [2024-07-15 12:05:08.970757] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.452 [2024-07-15 12:05:08.971041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.452 [2024-07-15 12:05:08.971058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:23:55.452 [2024-07-15 12:05:08.971106] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:23:55.452 [2024-07-15 12:05:08.971123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:23:55.452 pt3 00:23:55.452 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:55.452 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.452 12:05:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:23:55.711 [2024-07-15 12:05:09.135119] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:23:55.711 [2024-07-15 12:05:09.135150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:55.711 [2024-07-15 12:05:09.135165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe343e0 00:23:55.711 [2024-07-15 12:05:09.135184] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:55.711 [2024-07-15 12:05:09.135449] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:55.711 [2024-07-15 12:05:09.135466] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:23:55.711 [2024-07-15 12:05:09.135512] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:23:55.711 [2024-07-15 12:05:09.135529] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:23:55.711 [2024-07-15 12:05:09.135640] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe28f60 00:23:55.711 [2024-07-15 12:05:09.135650] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:55.711 [2024-07-15 12:05:09.135819] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd4d030 00:23:55.711 [2024-07-15 12:05:09.135951] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe28f60 00:23:55.711 [2024-07-15 12:05:09.135961] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe28f60 00:23:55.711 [2024-07-15 12:05:09.136056] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:55.711 pt4 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:23:55.711 12:05:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.711 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.971 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.971 "name": "raid_bdev1", 00:23:55.971 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:55.971 "strip_size_kb": 0, 00:23:55.971 "state": "online", 00:23:55.971 "raid_level": "raid1", 00:23:55.971 "superblock": true, 00:23:55.971 "num_base_bdevs": 4, 00:23:55.971 "num_base_bdevs_discovered": 4, 00:23:55.971 "num_base_bdevs_operational": 4, 00:23:55.971 "base_bdevs_list": [ 00:23:55.971 { 00:23:55.971 "name": "pt1", 00:23:55.971 "uuid": "00000000-0000-0000-0000-000000000001", 
00:23:55.971 "is_configured": true, 00:23:55.971 "data_offset": 2048, 00:23:55.971 "data_size": 63488 00:23:55.971 }, 00:23:55.971 { 00:23:55.971 "name": "pt2", 00:23:55.971 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:55.971 "is_configured": true, 00:23:55.971 "data_offset": 2048, 00:23:55.971 "data_size": 63488 00:23:55.971 }, 00:23:55.971 { 00:23:55.971 "name": "pt3", 00:23:55.971 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:55.971 "is_configured": true, 00:23:55.971 "data_offset": 2048, 00:23:55.971 "data_size": 63488 00:23:55.971 }, 00:23:55.971 { 00:23:55.971 "name": "pt4", 00:23:55.971 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:55.971 "is_configured": true, 00:23:55.971 "data_offset": 2048, 00:23:55.971 "data_size": 63488 00:23:55.971 } 00:23:55.971 ] 00:23:55.971 }' 00:23:55.971 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.971 12:05:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:56.230 12:05:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:23:56.489 [2024-07-15 12:05:10.009742] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:56.489 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:23:56.489 "name": "raid_bdev1", 00:23:56.489 "aliases": [ 00:23:56.489 "71598cfa-53fa-4833-b24b-05bbcb76748b" 00:23:56.489 ], 00:23:56.489 "product_name": "Raid Volume", 00:23:56.489 "block_size": 512, 00:23:56.489 "num_blocks": 63488, 00:23:56.489 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:56.489 "assigned_rate_limits": { 00:23:56.489 "rw_ios_per_sec": 0, 00:23:56.489 "rw_mbytes_per_sec": 0, 00:23:56.489 "r_mbytes_per_sec": 0, 00:23:56.489 "w_mbytes_per_sec": 0 00:23:56.489 }, 00:23:56.489 "claimed": false, 00:23:56.489 "zoned": false, 00:23:56.489 "supported_io_types": { 00:23:56.489 "read": true, 00:23:56.489 "write": true, 00:23:56.489 "unmap": false, 00:23:56.489 "flush": false, 00:23:56.489 "reset": true, 00:23:56.489 "nvme_admin": false, 00:23:56.489 "nvme_io": false, 00:23:56.489 "nvme_io_md": false, 00:23:56.489 "write_zeroes": true, 00:23:56.489 "zcopy": false, 00:23:56.489 "get_zone_info": false, 00:23:56.489 "zone_management": false, 00:23:56.489 "zone_append": false, 00:23:56.489 "compare": false, 00:23:56.489 "compare_and_write": false, 00:23:56.489 "abort": false, 00:23:56.489 "seek_hole": false, 00:23:56.489 "seek_data": false, 00:23:56.489 "copy": false, 00:23:56.489 "nvme_iov_md": false 00:23:56.489 }, 00:23:56.489 "memory_domains": [ 00:23:56.489 { 00:23:56.489 "dma_device_id": "system", 00:23:56.489 "dma_device_type": 1 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.489 "dma_device_type": 2 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "system", 00:23:56.489 "dma_device_type": 1 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.489 "dma_device_type": 2 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "system", 00:23:56.489 
"dma_device_type": 1 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.489 "dma_device_type": 2 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "system", 00:23:56.489 "dma_device_type": 1 00:23:56.489 }, 00:23:56.489 { 00:23:56.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.489 "dma_device_type": 2 00:23:56.489 } 00:23:56.489 ], 00:23:56.490 "driver_specific": { 00:23:56.490 "raid": { 00:23:56.490 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:56.490 "strip_size_kb": 0, 00:23:56.490 "state": "online", 00:23:56.490 "raid_level": "raid1", 00:23:56.490 "superblock": true, 00:23:56.490 "num_base_bdevs": 4, 00:23:56.490 "num_base_bdevs_discovered": 4, 00:23:56.490 "num_base_bdevs_operational": 4, 00:23:56.490 "base_bdevs_list": [ 00:23:56.490 { 00:23:56.490 "name": "pt1", 00:23:56.490 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:56.490 "is_configured": true, 00:23:56.490 "data_offset": 2048, 00:23:56.490 "data_size": 63488 00:23:56.490 }, 00:23:56.490 { 00:23:56.490 "name": "pt2", 00:23:56.490 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:56.490 "is_configured": true, 00:23:56.490 "data_offset": 2048, 00:23:56.490 "data_size": 63488 00:23:56.490 }, 00:23:56.490 { 00:23:56.490 "name": "pt3", 00:23:56.490 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:56.490 "is_configured": true, 00:23:56.490 "data_offset": 2048, 00:23:56.490 "data_size": 63488 00:23:56.490 }, 00:23:56.490 { 00:23:56.490 "name": "pt4", 00:23:56.490 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:56.490 "is_configured": true, 00:23:56.490 "data_offset": 2048, 00:23:56.490 "data_size": 63488 00:23:56.490 } 00:23:56.490 ] 00:23:56.490 } 00:23:56.490 } 00:23:56.490 }' 00:23:56.490 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:23:56.490 12:05:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:23:56.490 pt2 00:23:56.490 pt3 00:23:56.490 pt4' 00:23:56.490 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:56.749 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:56.749 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:23:56.749 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:56.749 "name": "pt1", 00:23:56.749 "aliases": [ 00:23:56.749 "00000000-0000-0000-0000-000000000001" 00:23:56.749 ], 00:23:56.749 "product_name": "passthru", 00:23:56.749 "block_size": 512, 00:23:56.749 "num_blocks": 65536, 00:23:56.749 "uuid": "00000000-0000-0000-0000-000000000001", 00:23:56.749 "assigned_rate_limits": { 00:23:56.749 "rw_ios_per_sec": 0, 00:23:56.749 "rw_mbytes_per_sec": 0, 00:23:56.749 "r_mbytes_per_sec": 0, 00:23:56.749 "w_mbytes_per_sec": 0 00:23:56.749 }, 00:23:56.749 "claimed": true, 00:23:56.749 "claim_type": "exclusive_write", 00:23:56.749 "zoned": false, 00:23:56.749 "supported_io_types": { 00:23:56.749 "read": true, 00:23:56.749 "write": true, 00:23:56.749 "unmap": true, 00:23:56.749 "flush": true, 00:23:56.749 "reset": true, 00:23:56.749 "nvme_admin": false, 00:23:56.749 "nvme_io": false, 00:23:56.749 "nvme_io_md": false, 00:23:56.749 "write_zeroes": true, 00:23:56.749 "zcopy": true, 00:23:56.749 "get_zone_info": false, 00:23:56.749 "zone_management": false, 00:23:56.749 "zone_append": false, 00:23:56.749 "compare": false, 00:23:56.749 "compare_and_write": false, 00:23:56.749 "abort": true, 00:23:56.749 "seek_hole": false, 00:23:56.749 "seek_data": false, 00:23:56.749 "copy": true, 00:23:56.749 "nvme_iov_md": false 00:23:56.749 }, 00:23:56.749 "memory_domains": [ 00:23:56.749 { 00:23:56.749 "dma_device_id": "system", 00:23:56.749 
"dma_device_type": 1 00:23:56.749 }, 00:23:56.749 { 00:23:56.749 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:56.749 "dma_device_type": 2 00:23:56.749 } 00:23:56.749 ], 00:23:56.749 "driver_specific": { 00:23:56.749 "passthru": { 00:23:56.749 "name": "pt1", 00:23:56.749 "base_bdev_name": "malloc1" 00:23:56.749 } 00:23:56.749 } 00:23:56.749 }' 00:23:56.749 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:57.008 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.267 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.267 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:57.268 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:57.268 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:23:57.268 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:57.527 12:05:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:57.527 "name": "pt2", 00:23:57.527 "aliases": [ 00:23:57.527 "00000000-0000-0000-0000-000000000002" 00:23:57.527 ], 00:23:57.527 "product_name": "passthru", 00:23:57.527 "block_size": 512, 00:23:57.527 "num_blocks": 65536, 00:23:57.527 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:57.527 "assigned_rate_limits": { 00:23:57.527 "rw_ios_per_sec": 0, 00:23:57.527 "rw_mbytes_per_sec": 0, 00:23:57.527 "r_mbytes_per_sec": 0, 00:23:57.527 "w_mbytes_per_sec": 0 00:23:57.527 }, 00:23:57.527 "claimed": true, 00:23:57.527 "claim_type": "exclusive_write", 00:23:57.527 "zoned": false, 00:23:57.527 "supported_io_types": { 00:23:57.527 "read": true, 00:23:57.527 "write": true, 00:23:57.527 "unmap": true, 00:23:57.527 "flush": true, 00:23:57.527 "reset": true, 00:23:57.527 "nvme_admin": false, 00:23:57.527 "nvme_io": false, 00:23:57.527 "nvme_io_md": false, 00:23:57.527 "write_zeroes": true, 00:23:57.527 "zcopy": true, 00:23:57.527 "get_zone_info": false, 00:23:57.527 "zone_management": false, 00:23:57.527 "zone_append": false, 00:23:57.527 "compare": false, 00:23:57.527 "compare_and_write": false, 00:23:57.527 "abort": true, 00:23:57.527 "seek_hole": false, 00:23:57.527 "seek_data": false, 00:23:57.527 "copy": true, 00:23:57.527 "nvme_iov_md": false 00:23:57.527 }, 00:23:57.527 "memory_domains": [ 00:23:57.527 { 00:23:57.527 "dma_device_id": "system", 00:23:57.527 "dma_device_type": 1 00:23:57.527 }, 00:23:57.527 { 00:23:57.527 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:57.527 "dma_device_type": 2 00:23:57.527 } 00:23:57.527 ], 00:23:57.527 "driver_specific": { 00:23:57.527 "passthru": { 00:23:57.527 "name": "pt2", 00:23:57.527 "base_bdev_name": "malloc2" 00:23:57.527 } 00:23:57.527 } 00:23:57.527 }' 00:23:57.527 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.527 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:57.527 12:05:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:57.527 12:05:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.527 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:57.527 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:57.527 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.527 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:23:57.787 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:58.045 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:58.045 "name": "pt3", 00:23:58.045 "aliases": [ 00:23:58.045 "00000000-0000-0000-0000-000000000003" 00:23:58.045 ], 00:23:58.045 "product_name": "passthru", 00:23:58.045 "block_size": 512, 00:23:58.045 "num_blocks": 65536, 00:23:58.045 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:58.045 "assigned_rate_limits": { 00:23:58.045 "rw_ios_per_sec": 0, 00:23:58.045 "rw_mbytes_per_sec": 0, 00:23:58.045 "r_mbytes_per_sec": 0, 00:23:58.045 "w_mbytes_per_sec": 0 00:23:58.045 }, 00:23:58.045 "claimed": true, 00:23:58.045 
"claim_type": "exclusive_write", 00:23:58.045 "zoned": false, 00:23:58.045 "supported_io_types": { 00:23:58.045 "read": true, 00:23:58.045 "write": true, 00:23:58.045 "unmap": true, 00:23:58.045 "flush": true, 00:23:58.045 "reset": true, 00:23:58.045 "nvme_admin": false, 00:23:58.045 "nvme_io": false, 00:23:58.045 "nvme_io_md": false, 00:23:58.045 "write_zeroes": true, 00:23:58.045 "zcopy": true, 00:23:58.045 "get_zone_info": false, 00:23:58.045 "zone_management": false, 00:23:58.045 "zone_append": false, 00:23:58.045 "compare": false, 00:23:58.045 "compare_and_write": false, 00:23:58.045 "abort": true, 00:23:58.045 "seek_hole": false, 00:23:58.045 "seek_data": false, 00:23:58.045 "copy": true, 00:23:58.045 "nvme_iov_md": false 00:23:58.045 }, 00:23:58.045 "memory_domains": [ 00:23:58.045 { 00:23:58.045 "dma_device_id": "system", 00:23:58.045 "dma_device_type": 1 00:23:58.045 }, 00:23:58.045 { 00:23:58.045 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.045 "dma_device_type": 2 00:23:58.045 } 00:23:58.045 ], 00:23:58.045 "driver_specific": { 00:23:58.045 "passthru": { 00:23:58.045 "name": "pt3", 00:23:58.045 "base_bdev_name": "malloc3" 00:23:58.045 } 00:23:58.045 } 00:23:58.045 }' 00:23:58.045 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:58.045 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:58.045 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:58.046 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:58.046 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:58.046 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:58.046 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:23:58.304 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:23:58.563 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:23:58.563 "name": "pt4", 00:23:58.563 "aliases": [ 00:23:58.563 "00000000-0000-0000-0000-000000000004" 00:23:58.563 ], 00:23:58.563 "product_name": "passthru", 00:23:58.563 "block_size": 512, 00:23:58.563 "num_blocks": 65536, 00:23:58.563 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:58.563 "assigned_rate_limits": { 00:23:58.563 "rw_ios_per_sec": 0, 00:23:58.563 "rw_mbytes_per_sec": 0, 00:23:58.563 "r_mbytes_per_sec": 0, 00:23:58.563 "w_mbytes_per_sec": 0 00:23:58.563 }, 00:23:58.563 "claimed": true, 00:23:58.563 "claim_type": "exclusive_write", 00:23:58.563 "zoned": false, 00:23:58.563 "supported_io_types": { 00:23:58.563 "read": true, 00:23:58.563 "write": true, 00:23:58.563 "unmap": true, 00:23:58.563 "flush": true, 00:23:58.563 "reset": true, 00:23:58.563 "nvme_admin": false, 00:23:58.563 "nvme_io": false, 00:23:58.563 "nvme_io_md": false, 00:23:58.563 "write_zeroes": true, 00:23:58.563 "zcopy": true, 00:23:58.563 "get_zone_info": false, 00:23:58.563 "zone_management": false, 00:23:58.563 "zone_append": false, 00:23:58.563 "compare": false, 00:23:58.563 
"compare_and_write": false, 00:23:58.563 "abort": true, 00:23:58.563 "seek_hole": false, 00:23:58.563 "seek_data": false, 00:23:58.563 "copy": true, 00:23:58.563 "nvme_iov_md": false 00:23:58.563 }, 00:23:58.563 "memory_domains": [ 00:23:58.563 { 00:23:58.563 "dma_device_id": "system", 00:23:58.563 "dma_device_type": 1 00:23:58.563 }, 00:23:58.563 { 00:23:58.563 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:23:58.563 "dma_device_type": 2 00:23:58.563 } 00:23:58.563 ], 00:23:58.563 "driver_specific": { 00:23:58.563 "passthru": { 00:23:58.563 "name": "pt4", 00:23:58.563 "base_bdev_name": "malloc4" 00:23:58.563 } 00:23:58.563 } 00:23:58.563 }' 00:23:58.563 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:58.563 12:05:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:23:58.563 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:23:58.563 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:58.563 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:23:58.563 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:23:58.563 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:58.823 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:23:59.082 [2024-07-15 12:05:12.432185] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:59.082 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 71598cfa-53fa-4833-b24b-05bbcb76748b '!=' 71598cfa-53fa-4833-b24b-05bbcb76748b ']' 00:23:59.082 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:23:59.082 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:23:59.082 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:23:59.082 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:23:59.342 [2024-07-15 12:05:12.684573] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.342 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:59.601 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:59.601 "name": "raid_bdev1", 00:23:59.601 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:23:59.601 "strip_size_kb": 0, 00:23:59.601 "state": "online", 00:23:59.601 "raid_level": "raid1", 00:23:59.601 "superblock": true, 00:23:59.601 "num_base_bdevs": 4, 00:23:59.601 "num_base_bdevs_discovered": 3, 00:23:59.601 "num_base_bdevs_operational": 3, 00:23:59.601 "base_bdevs_list": [ 00:23:59.601 { 00:23:59.601 "name": null, 00:23:59.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:59.601 "is_configured": false, 00:23:59.601 "data_offset": 2048, 00:23:59.601 "data_size": 63488 00:23:59.601 }, 00:23:59.601 { 00:23:59.601 "name": "pt2", 00:23:59.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:23:59.601 "is_configured": true, 00:23:59.601 "data_offset": 2048, 00:23:59.601 "data_size": 63488 00:23:59.601 }, 00:23:59.601 { 00:23:59.601 "name": "pt3", 00:23:59.601 "uuid": "00000000-0000-0000-0000-000000000003", 00:23:59.601 "is_configured": true, 00:23:59.601 "data_offset": 2048, 00:23:59.601 "data_size": 63488 00:23:59.601 }, 00:23:59.601 { 00:23:59.601 "name": "pt4", 00:23:59.601 "uuid": "00000000-0000-0000-0000-000000000004", 00:23:59.601 "is_configured": true, 00:23:59.601 "data_offset": 2048, 00:23:59.601 "data_size": 63488 00:23:59.601 } 00:23:59.601 ] 00:23:59.601 }' 00:23:59.601 12:05:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:59.601 
12:05:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:00.167 12:05:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:00.167 [2024-07-15 12:05:13.731303] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:00.167 [2024-07-15 12:05:13.731331] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:00.167 [2024-07-15 12:05:13.731390] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:00.167 [2024-07-15 12:05:13.731461] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:00.167 [2024-07-15 12:05:13.731473] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe28f60 name raid_bdev1, state offline 00:24:00.427 12:05:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.427 12:05:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:24:00.686 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:24:00.686 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:24:00.686 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:24:00.686 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:00.686 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:24:00.945 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:00.945 12:05:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:00.945 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:01.204 12:05:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:24:01.464 [2024-07-15 12:05:15.014682] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:24:01.464 [2024-07-15 12:05:15.014741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:01.464 [2024-07-15 12:05:15.014758] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed21f0 00:24:01.464 [2024-07-15 12:05:15.014770] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:01.464 [2024-07-15 12:05:15.016366] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:01.464 [2024-07-15 12:05:15.016393] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:24:01.464 [2024-07-15 12:05:15.016454] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:24:01.464 [2024-07-15 12:05:15.016478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:01.464 pt2 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.464 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:01.723 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:01.723 "name": "raid_bdev1", 00:24:01.723 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:24:01.723 "strip_size_kb": 0, 00:24:01.723 "state": "configuring", 
00:24:01.723 "raid_level": "raid1", 00:24:01.723 "superblock": true, 00:24:01.723 "num_base_bdevs": 4, 00:24:01.723 "num_base_bdevs_discovered": 1, 00:24:01.723 "num_base_bdevs_operational": 3, 00:24:01.723 "base_bdevs_list": [ 00:24:01.723 { 00:24:01.723 "name": null, 00:24:01.723 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:01.723 "is_configured": false, 00:24:01.723 "data_offset": 2048, 00:24:01.723 "data_size": 63488 00:24:01.723 }, 00:24:01.724 { 00:24:01.724 "name": "pt2", 00:24:01.724 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:01.724 "is_configured": true, 00:24:01.724 "data_offset": 2048, 00:24:01.724 "data_size": 63488 00:24:01.724 }, 00:24:01.724 { 00:24:01.724 "name": null, 00:24:01.724 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:01.724 "is_configured": false, 00:24:01.724 "data_offset": 2048, 00:24:01.724 "data_size": 63488 00:24:01.724 }, 00:24:01.724 { 00:24:01.724 "name": null, 00:24:01.724 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:01.724 "is_configured": false, 00:24:01.724 "data_offset": 2048, 00:24:01.724 "data_size": 63488 00:24:01.724 } 00:24:01.724 ] 00:24:01.724 }' 00:24:01.724 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:01.724 12:05:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:02.292 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:24:02.292 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:02.292 12:05:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:24:02.551 [2024-07-15 12:05:16.105595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:24:02.551 [2024-07-15 12:05:16.105647] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:02.551 [2024-07-15 12:05:16.105669] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe32790 00:24:02.551 [2024-07-15 12:05:16.105683] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:02.551 [2024-07-15 12:05:16.106018] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:02.551 [2024-07-15 12:05:16.106035] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:24:02.551 [2024-07-15 12:05:16.106093] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:24:02.551 [2024-07-15 12:05:16.106113] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:02.551 pt3 00:24:02.551 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:02.551 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:02.551 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:02.551 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:02.551 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:02.552 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.120 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.120 "name": "raid_bdev1", 00:24:03.120 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:24:03.120 "strip_size_kb": 0, 00:24:03.120 "state": "configuring", 00:24:03.120 "raid_level": "raid1", 00:24:03.120 "superblock": true, 00:24:03.120 "num_base_bdevs": 4, 00:24:03.120 "num_base_bdevs_discovered": 2, 00:24:03.120 "num_base_bdevs_operational": 3, 00:24:03.120 "base_bdevs_list": [ 00:24:03.120 { 00:24:03.120 "name": null, 00:24:03.120 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.120 "is_configured": false, 00:24:03.120 "data_offset": 2048, 00:24:03.120 "data_size": 63488 00:24:03.120 }, 00:24:03.120 { 00:24:03.120 "name": "pt2", 00:24:03.120 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:03.120 "is_configured": true, 00:24:03.120 "data_offset": 2048, 00:24:03.120 "data_size": 63488 00:24:03.120 }, 00:24:03.120 { 00:24:03.120 "name": "pt3", 00:24:03.120 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:03.120 "is_configured": true, 00:24:03.120 "data_offset": 2048, 00:24:03.120 "data_size": 63488 00:24:03.120 }, 00:24:03.120 { 00:24:03.120 "name": null, 00:24:03.120 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:03.120 "is_configured": false, 00:24:03.120 "data_offset": 2048, 00:24:03.120 "data_size": 63488 00:24:03.120 } 00:24:03.120 ] 00:24:03.120 }' 00:24:03.120 12:05:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.120 12:05:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:03.796 [2024-07-15 12:05:17.264676] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:03.796 [2024-07-15 12:05:17.264734] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.796 [2024-07-15 12:05:17.264754] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed9680 00:24:03.796 [2024-07-15 12:05:17.264766] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.796 [2024-07-15 12:05:17.265099] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.796 [2024-07-15 12:05:17.265116] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:03.796 [2024-07-15 12:05:17.265174] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:03.796 [2024-07-15 12:05:17.265193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:03.796 [2024-07-15 12:05:17.265303] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xed5780 00:24:03.796 [2024-07-15 12:05:17.265314] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:03.796 [2024-07-15 12:05:17.265487] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe330e0 00:24:03.796 [2024-07-15 12:05:17.265616] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed5780 00:24:03.796 [2024-07-15 12:05:17.265626] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed5780 
00:24:03.796 [2024-07-15 12:05:17.265732] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.796 pt4 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.796 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.055 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:04.055 "name": "raid_bdev1", 00:24:04.055 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:24:04.055 "strip_size_kb": 0, 00:24:04.055 "state": "online", 00:24:04.055 "raid_level": "raid1", 00:24:04.055 "superblock": true, 00:24:04.055 "num_base_bdevs": 4, 00:24:04.055 "num_base_bdevs_discovered": 3, 00:24:04.055 
"num_base_bdevs_operational": 3, 00:24:04.055 "base_bdevs_list": [ 00:24:04.055 { 00:24:04.055 "name": null, 00:24:04.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.055 "is_configured": false, 00:24:04.055 "data_offset": 2048, 00:24:04.055 "data_size": 63488 00:24:04.055 }, 00:24:04.055 { 00:24:04.055 "name": "pt2", 00:24:04.055 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:04.055 "is_configured": true, 00:24:04.055 "data_offset": 2048, 00:24:04.055 "data_size": 63488 00:24:04.055 }, 00:24:04.055 { 00:24:04.055 "name": "pt3", 00:24:04.055 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:04.055 "is_configured": true, 00:24:04.055 "data_offset": 2048, 00:24:04.055 "data_size": 63488 00:24:04.055 }, 00:24:04.055 { 00:24:04.055 "name": "pt4", 00:24:04.055 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:04.055 "is_configured": true, 00:24:04.055 "data_offset": 2048, 00:24:04.055 "data_size": 63488 00:24:04.055 } 00:24:04.055 ] 00:24:04.055 }' 00:24:04.055 12:05:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:04.055 12:05:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:04.623 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:04.883 [2024-07-15 12:05:18.395654] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:04.883 [2024-07-15 12:05:18.395680] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:04.883 [2024-07-15 12:05:18.395742] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:04.883 [2024-07-15 12:05:18.395808] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:04.883 [2024-07-15 12:05:18.395820] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0xed5780 name raid_bdev1, state offline 00:24:04.883 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.883 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:24:05.141 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:24:05.141 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:24:05.141 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:24:05.141 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:24:05.141 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:24:05.399 12:05:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:24:05.658 [2024-07-15 12:05:19.137576] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:24:05.658 [2024-07-15 12:05:19.137621] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:05.658 [2024-07-15 12:05:19.137638] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe29890 00:24:05.658 [2024-07-15 12:05:19.137651] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:05.658 [2024-07-15 12:05:19.139259] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:05.658 [2024-07-15 12:05:19.139286] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:24:05.658 [2024-07-15 12:05:19.139350] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:24:05.658 [2024-07-15 12:05:19.139376] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:24:05.658 [2024-07-15 12:05:19.139477] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:24:05.658 [2024-07-15 12:05:19.139490] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:05.658 [2024-07-15 12:05:19.139505] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed61b0 name raid_bdev1, state configuring 00:24:05.658 [2024-07-15 12:05:19.139528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:24:05.658 [2024-07-15 12:05:19.139605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:24:05.658 pt1 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.658 12:05:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.658 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.917 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.917 "name": "raid_bdev1", 00:24:05.917 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:24:05.917 "strip_size_kb": 0, 00:24:05.917 "state": "configuring", 00:24:05.917 "raid_level": "raid1", 00:24:05.917 "superblock": true, 00:24:05.917 "num_base_bdevs": 4, 00:24:05.917 "num_base_bdevs_discovered": 2, 00:24:05.917 "num_base_bdevs_operational": 3, 00:24:05.917 "base_bdevs_list": [ 00:24:05.917 { 00:24:05.917 "name": null, 00:24:05.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.917 "is_configured": false, 00:24:05.917 "data_offset": 2048, 00:24:05.917 "data_size": 63488 00:24:05.917 }, 00:24:05.917 { 00:24:05.917 "name": "pt2", 00:24:05.917 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:05.917 "is_configured": true, 00:24:05.917 "data_offset": 2048, 00:24:05.917 "data_size": 63488 00:24:05.917 }, 00:24:05.917 { 00:24:05.917 "name": "pt3", 00:24:05.917 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:05.917 "is_configured": true, 00:24:05.917 "data_offset": 2048, 00:24:05.917 "data_size": 63488 00:24:05.917 }, 00:24:05.917 { 00:24:05.917 "name": null, 00:24:05.917 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:05.917 "is_configured": false, 00:24:05.917 "data_offset": 2048, 00:24:05.917 "data_size": 63488 00:24:05.917 } 00:24:05.917 ] 00:24:05.917 }' 00:24:05.917 12:05:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.917 12:05:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:24:06.485 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:24:06.485 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:06.743 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:24:06.743 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:24:07.002 [2024-07-15 12:05:20.517242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:24:07.002 [2024-07-15 12:05:20.517295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:07.002 [2024-07-15 12:05:20.517314] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xed5a00 00:24:07.002 [2024-07-15 12:05:20.517326] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:07.002 [2024-07-15 12:05:20.517659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:07.002 [2024-07-15 12:05:20.517676] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:24:07.002 [2024-07-15 12:05:20.517751] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:24:07.002 [2024-07-15 12:05:20.517773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:24:07.002 [2024-07-15 12:05:20.517890] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xed33f0 00:24:07.002 [2024-07-15 12:05:20.517900] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:07.002 [2024-07-15 12:05:20.518073] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xe31df0 00:24:07.002 [2024-07-15 12:05:20.518204] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xed33f0 00:24:07.002 [2024-07-15 12:05:20.518214] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xed33f0 00:24:07.002 [2024-07-15 12:05:20.518312] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:07.002 pt4 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.002 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.261 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.261 "name": "raid_bdev1", 
00:24:07.261 "uuid": "71598cfa-53fa-4833-b24b-05bbcb76748b", 00:24:07.261 "strip_size_kb": 0, 00:24:07.261 "state": "online", 00:24:07.261 "raid_level": "raid1", 00:24:07.261 "superblock": true, 00:24:07.261 "num_base_bdevs": 4, 00:24:07.261 "num_base_bdevs_discovered": 3, 00:24:07.261 "num_base_bdevs_operational": 3, 00:24:07.261 "base_bdevs_list": [ 00:24:07.261 { 00:24:07.261 "name": null, 00:24:07.261 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.261 "is_configured": false, 00:24:07.261 "data_offset": 2048, 00:24:07.261 "data_size": 63488 00:24:07.261 }, 00:24:07.261 { 00:24:07.261 "name": "pt2", 00:24:07.261 "uuid": "00000000-0000-0000-0000-000000000002", 00:24:07.261 "is_configured": true, 00:24:07.261 "data_offset": 2048, 00:24:07.261 "data_size": 63488 00:24:07.261 }, 00:24:07.261 { 00:24:07.261 "name": "pt3", 00:24:07.261 "uuid": "00000000-0000-0000-0000-000000000003", 00:24:07.261 "is_configured": true, 00:24:07.261 "data_offset": 2048, 00:24:07.261 "data_size": 63488 00:24:07.261 }, 00:24:07.261 { 00:24:07.261 "name": "pt4", 00:24:07.261 "uuid": "00000000-0000-0000-0000-000000000004", 00:24:07.261 "is_configured": true, 00:24:07.261 "data_offset": 2048, 00:24:07.261 "data_size": 63488 00:24:07.261 } 00:24:07.261 ] 00:24:07.261 }' 00:24:07.261 12:05:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.261 12:05:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:24:07.827 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:24:07.827 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:24:08.084 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:24:08.084 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:24:08.084 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid'
00:24:08.343 [2024-07-15 12:05:21.857061] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 71598cfa-53fa-4833-b24b-05bbcb76748b '!=' 71598cfa-53fa-4833-b24b-05bbcb76748b ']'
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 1557368
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@948 -- # '[' -z 1557368 ']'
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@952 -- # kill -0 1557368
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # uname
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1557368
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1557368'
killing process with pid 1557368
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # kill 1557368
00:24:08.343 [2024-07-15 12:05:21.927743] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:24:08.343 [2024-07-15 12:05:21.927800] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:24:08.343 [2024-07-15 12:05:21.927872] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:24:08.343 [2024-07-15 12:05:21.927884] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xed33f0 name raid_bdev1, state offline
00:24:08.343 12:05:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@972 -- # wait 1557368
00:24:08.601 [2024-07-15 12:05:21.963924] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:24:08.601 12:05:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0
00:24:08.601
00:24:08.601 real 0m26.350s
00:24:08.601 user 0m48.313s
00:24:08.601 sys 0m4.646s
00:24:08.601 12:05:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:24:08.601 12:05:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:24:08.601 ************************************
00:24:08.601 END TEST raid_superblock_test
00:24:08.601 ************************************
00:24:08.860 12:05:22 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:24:08.860 12:05:22 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read
00:24:08.860 12:05:22 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:24:08.860 12:05:22 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:24:08.860 12:05:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:24:08.860 ************************************
00:24:08.860 START TEST raid_read_error_test
00:24:08.860 ************************************
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 read
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']'
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.KMLtBCRWBQ
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1561355
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1561355 /var/tmp/spdk-raid.sock
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@829 -- # '[' -z 1561355 ']'
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:24:08.860 12:05:22 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:08.860 [2024-07-15 12:05:22.327431] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:24:08.860 [2024-07-15 12:05:22.327503] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1561355 ]
00:24:09.120 [2024-07-15 12:05:22.457249] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:09.120 [2024-07-15 12:05:22.558920] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:24:09.120 [2024-07-15 12:05:22.613280] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:09.120 [2024-07-15 12:05:22.613313] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:09.688 12:05:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:24:09.688 12:05:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@862 -- # return 0
00:24:09.688 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:24:09.688 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:24:09.947 BaseBdev1_malloc
00:24:09.947 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:24:10.206 true
00:24:10.206 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:24:10.465 [2024-07-15 12:05:23.980506] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:24:10.465 [2024-07-15 12:05:23.980552] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:10.465 [2024-07-15 12:05:23.980572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19214e0
00:24:10.465 [2024-07-15 12:05:23.980585] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:10.465 [2024-07-15 12:05:23.982339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:10.465 [2024-07-15 12:05:23.982372] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
BaseBdev1
00:24:10.465 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:24:10.465 12:05:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:24:10.724 BaseBdev2_malloc
00:24:10.724 12:05:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:24:10.982 true
00:24:10.982 12:05:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:24:11.240 [2024-07-15 12:05:24.707026] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:24:11.240 [2024-07-15 12:05:24.707069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:11.240 [2024-07-15 12:05:24.707089] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19267b0
00:24:11.240 [2024-07-15 12:05:24.707102] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:11.240 [2024-07-15 12:05:24.708650] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:11.240 [2024-07-15 12:05:24.708677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
BaseBdev2
00:24:11.240 12:05:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:24:11.240 12:05:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:24:11.498 BaseBdev3_malloc
00:24:11.498 12:05:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:24:11.756 true
00:24:11.756 12:05:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:24:12.015 [2024-07-15 12:05:25.442996] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:24:12.015 [2024-07-15 12:05:25.443045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:12.015 [2024-07-15 12:05:25.443067] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x19288f0
00:24:12.015 [2024-07-15 12:05:25.443080] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:12.015 [2024-07-15 12:05:25.444672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:12.015 [2024-07-15 12:05:25.444707] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:24:12.015 BaseBdev3
00:24:12.015 12:05:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:24:12.015 12:05:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:24:12.275 BaseBdev4_malloc
00:24:12.275 12:05:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:24:12.534 true
00:24:12.534 12:05:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:24:12.793 [2024-07-15 12:05:26.162706] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:24:12.793 [2024-07-15 12:05:26.162749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:12.793 [2024-07-15 12:05:26.162769] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x192adc0
00:24:12.793 [2024-07-15 12:05:26.162782] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:12.793 [2024-07-15 12:05:26.164311] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:12.793 [2024-07-15 12:05:26.164339] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:24:12.793 BaseBdev4
00:24:12.793 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
00:24:13.053 [2024-07-15 12:05:26.391334] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:24:13.053 [2024-07-15 12:05:26.392633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:24:13.053 [2024-07-15 12:05:26.392708] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:24:13.053 [2024-07-15 12:05:26.392767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:24:13.053 [2024-07-15 12:05:26.392998] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1929090
00:24:13.053 [2024-07-15 12:05:26.393009] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:24:13.053 [2024-07-15 12:05:26.393203] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x192bbf0
00:24:13.053 [2024-07-15 12:05:26.393356] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1929090
00:24:13.053 [2024-07-15 12:05:26.393366] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1929090
00:24:13.053 [2024-07-15 12:05:26.393470] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:13.053 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:13.312 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:13.312 "name": "raid_bdev1",
00:24:13.312 "uuid": "acc065ff-fb41-4f08-8c11-a62192e8d11e",
00:24:13.312 "strip_size_kb": 0,
00:24:13.312 "state": "online",
00:24:13.312 "raid_level": "raid1",
00:24:13.312 "superblock": true,
00:24:13.312 "num_base_bdevs": 4,
00:24:13.312 "num_base_bdevs_discovered": 4,
00:24:13.312 "num_base_bdevs_operational": 4,
00:24:13.312 "base_bdevs_list": [
00:24:13.312 {
00:24:13.312 "name": "BaseBdev1",
00:24:13.312 "uuid": "ef901cdd-8c9e-547c-b057-ebff2e38304f",
00:24:13.312 "is_configured": true,
00:24:13.312 "data_offset": 2048,
00:24:13.312 "data_size": 63488
00:24:13.312 },
00:24:13.312 {
00:24:13.312 "name": "BaseBdev2",
00:24:13.312 "uuid": "c30c292e-f119-5fa6-a9d0-a745cf7abb51",
00:24:13.312 "is_configured": true,
00:24:13.312 "data_offset": 2048,
00:24:13.312 "data_size": 63488
00:24:13.312 },
00:24:13.312 {
00:24:13.312 "name": "BaseBdev3",
00:24:13.312 "uuid": "ac177d25-fc85-5f96-813b-993e77fc235f",
00:24:13.312 "is_configured": true,
00:24:13.312 "data_offset": 2048,
00:24:13.312 "data_size": 63488
00:24:13.312 },
00:24:13.312 {
00:24:13.312 "name": "BaseBdev4",
00:24:13.312 "uuid": "8d5626d8-532b-5120-8a17-ae495c62655a",
00:24:13.312 "is_configured": true,
00:24:13.312 "data_offset": 2048,
00:24:13.312 "data_size": 63488
00:24:13.312 }
00:24:13.312 ]
00:24:13.312 }'
00:24:13.312 12:05:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:13.312 12:05:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:14.249 12:05:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1
00:24:14.249 12:05:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:24:14.249 [2024-07-15 12:05:27.650948] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x192f4b0
00:24:15.186 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]]
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]]
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:24:15.445 12:05:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:24:15.704 12:05:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:24:15.704 "name": "raid_bdev1",
00:24:15.704 "uuid": "acc065ff-fb41-4f08-8c11-a62192e8d11e",
00:24:15.704 "strip_size_kb": 0,
00:24:15.704 "state": "online",
00:24:15.704 "raid_level": "raid1",
00:24:15.704 "superblock": true,
00:24:15.704 "num_base_bdevs": 4,
00:24:15.704 "num_base_bdevs_discovered": 4,
00:24:15.704 "num_base_bdevs_operational": 4,
00:24:15.704 "base_bdevs_list": [
00:24:15.704 {
00:24:15.704 "name": "BaseBdev1",
00:24:15.704 "uuid": "ef901cdd-8c9e-547c-b057-ebff2e38304f",
00:24:15.704 "is_configured": true,
00:24:15.704 "data_offset": 2048,
00:24:15.704 "data_size": 63488
00:24:15.704 },
00:24:15.704 {
00:24:15.704 "name": "BaseBdev2",
00:24:15.704 "uuid": "c30c292e-f119-5fa6-a9d0-a745cf7abb51",
00:24:15.704 "is_configured": true,
00:24:15.704 "data_offset": 2048,
00:24:15.704 "data_size": 63488
00:24:15.704 },
00:24:15.704 {
00:24:15.704 "name": "BaseBdev3",
00:24:15.704 "uuid": "ac177d25-fc85-5f96-813b-993e77fc235f",
00:24:15.704 "is_configured": true,
00:24:15.704 "data_offset": 2048,
00:24:15.704 "data_size": 63488
00:24:15.704 },
00:24:15.704 {
00:24:15.704 "name": "BaseBdev4",
00:24:15.704 "uuid": "8d5626d8-532b-5120-8a17-ae495c62655a",
00:24:15.704 "is_configured": true,
00:24:15.704 "data_offset": 2048,
00:24:15.704 "data_size": 63488
00:24:15.704 }
00:24:15.704 ]
00:24:15.704 }'
00:24:15.704 12:05:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:24:15.704 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:24:16.271 [2024-07-15 12:05:29.820655] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:24:16.271 [2024-07-15 12:05:29.820705] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:24:16.271 [2024-07-15 12:05:29.823929] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:24:16.271 [2024-07-15 12:05:29.823976] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:24:16.271 [2024-07-15 12:05:29.824091] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:24:16.271 [2024-07-15 12:05:29.824103] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1929090 name raid_bdev1, state offline
00:24:16.271 0
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1561355
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@948 -- # '[' -z 1561355 ']'
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@952 -- # kill -0 1561355
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # uname
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:24:16.271 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1561355
00:24:16.530 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:24:16.530 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:24:16.530 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1561355'
killing process with pid 1561355
00:24:16.530 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # kill 1561355
00:24:16.530 [2024-07-15 12:05:29.905648] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:24:16.530 12:05:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@972 -- # wait 1561355
00:24:16.530 [2024-07-15 12:05:29.936983] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.KMLtBCRWBQ
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}'
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]]
00:24:16.788
00:24:16.788 real 0m7.918s
00:24:16.788 user 0m12.822s
00:24:16.788 sys 0m1.348s
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # xtrace_disable
00:24:16.788 12:05:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:16.788 ************************************
00:24:16.788 END TEST raid_read_error_test
00:24:16.788 ************************************
00:24:16.788 12:05:30 bdev_raid -- common/autotest_common.sh@1142 -- # return 0
00:24:16.788 12:05:30 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write
00:24:16.788 12:05:30 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']'
00:24:16.788 12:05:30 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable
00:24:16.788 12:05:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:24:16.788 ************************************
00:24:16.788 START TEST raid_write_error_test
00:24:16.788 ************************************
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1123 -- # raid_io_error_test raid1 4 write
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs ))
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4')
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']'
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.CO36ZkCOBz
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=1562376
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 1562376 /var/tmp/spdk-raid.sock
12:05:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@829 -- # '[' -z 1562376 ']'
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local max_retries=100
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # xtrace_disable
00:24:16.788 12:05:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:24:16.788 [2024-07-15 12:05:30.379703] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:24:16.788 [2024-07-15 12:05:30.379835] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1562376 ]
00:24:17.070 [2024-07-15 12:05:30.567273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:24:17.070 [2024-07-15 12:05:30.664069] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:24:17.328 [2024-07-15 12:05:30.732472] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:17.328 [2024-07-15 12:05:30.732507] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:24:17.894 12:05:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:24:17.894 12:05:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@862 -- # return 0
00:24:17.894 12:05:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
00:24:17.894 12:05:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:24:18.153 BaseBdev1_malloc
00:24:18.153 12:05:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:24:18.720 true
00:24:18.721 12:05:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:24:18.979 [2024-07-15 12:05:32.395006] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:24:18.979 [2024-07-15 12:05:32.395054] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:18.979 [2024-07-15 12:05:32.395074] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x294d4e0
00:24:18.979 [2024-07-15 12:05:32.395087] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:18.979 [2024-07-15 12:05:32.396850] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:18.979 [2024-07-15 12:05:32.396879] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
BaseBdev1
12:05:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
12:05:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:24:19.237 BaseBdev2_malloc
00:24:19.237 12:05:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:24:19.804 true
00:24:19.804 12:05:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:24:20.370 [2024-07-15 12:05:33.666916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:24:20.370 [2024-07-15 12:05:33.666967] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:20.370 [2024-07-15 12:05:33.666988] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29527b0
00:24:20.370 [2024-07-15 12:05:33.667001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:20.370 [2024-07-15 12:05:33.668620] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:20.370 [2024-07-15 12:05:33.668647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
BaseBdev2
12:05:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
12:05:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:24:20.629 BaseBdev3_malloc
00:24:20.629 12:05:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc
00:24:20.887 true
00:24:20.887 12:05:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3
00:24:21.455 [2024-07-15 12:05:34.743648] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc
00:24:21.455 [2024-07-15 12:05:34.743703] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:21.455 [2024-07-15 12:05:34.743725] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x29548f0
00:24:21.455 [2024-07-15 12:05:34.743738] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:21.455 [2024-07-15 12:05:34.745350] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:21.455 [2024-07-15 12:05:34.745384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
BaseBdev3
12:05:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}"
12:05:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:24:21.714 BaseBdev4_malloc
00:24:21.714 12:05:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc
00:24:22.282 true
00:24:22.282 12:05:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4
00:24:22.850 [2024-07-15 12:05:35.754939] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc
00:24:22.850 [2024-07-15 12:05:35.754989] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:24:22.850 [2024-07-15 12:05:35.755011] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2956dc0
00:24:22.850 [2024-07-15 12:05:35.755024] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:24:22.850 [2024-07-15 12:05:35.756693] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:24:22.850 [2024-07-15 12:05:35.756721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
BaseBdev4
12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s
[2024-07-15 12:05:36.268303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
[2024-07-15 12:05:36.269697] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
[2024-07-15 12:05:36.269767] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
[2024-07-15 12:05:36.269826]
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:22.850 [2024-07-15 12:05:36.270060] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2955090 00:24:22.850 [2024-07-15 12:05:36.270071] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:22.850 [2024-07-15 12:05:36.270275] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2957bf0 00:24:22.850 [2024-07-15 12:05:36.270433] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2955090 00:24:22.850 [2024-07-15 12:05:36.270444] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2955090 00:24:22.850 [2024-07-15 12:05:36.270554] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:22.850 12:05:36 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.850 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.419 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.419 "name": "raid_bdev1", 00:24:23.419 "uuid": "8effc169-7919-447f-9344-6f81833160a2", 00:24:23.419 "strip_size_kb": 0, 00:24:23.419 "state": "online", 00:24:23.419 "raid_level": "raid1", 00:24:23.419 "superblock": true, 00:24:23.419 "num_base_bdevs": 4, 00:24:23.419 "num_base_bdevs_discovered": 4, 00:24:23.419 "num_base_bdevs_operational": 4, 00:24:23.419 "base_bdevs_list": [ 00:24:23.419 { 00:24:23.419 "name": "BaseBdev1", 00:24:23.419 "uuid": "259c3ea1-e8dc-524b-bf7d-d7141ddae52b", 00:24:23.419 "is_configured": true, 00:24:23.419 "data_offset": 2048, 00:24:23.419 "data_size": 63488 00:24:23.419 }, 00:24:23.419 { 00:24:23.419 "name": "BaseBdev2", 00:24:23.419 "uuid": "30412b52-b3ec-5dac-97d8-be3790286d7e", 00:24:23.419 "is_configured": true, 00:24:23.419 "data_offset": 2048, 00:24:23.419 "data_size": 63488 00:24:23.419 }, 00:24:23.419 { 00:24:23.419 "name": "BaseBdev3", 00:24:23.419 "uuid": "616718e0-9d43-57ef-8150-8c93f4a76967", 00:24:23.419 "is_configured": true, 00:24:23.419 "data_offset": 2048, 00:24:23.419 "data_size": 63488 00:24:23.419 }, 00:24:23.419 { 00:24:23.419 "name": "BaseBdev4", 00:24:23.419 "uuid": "d30d8a5d-b03b-5fed-8746-7cb887274e0f", 00:24:23.419 "is_configured": true, 00:24:23.419 "data_offset": 2048, 00:24:23.419 "data_size": 63488 00:24:23.419 } 00:24:23.419 ] 00:24:23.419 }' 00:24:23.419 12:05:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.419 12:05:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:24.385 12:05:37 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:24:24.385 12:05:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:24.385 [2024-07-15 12:05:37.796615] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x295b4b0 00:24:25.323 12:05:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:24:25.586 [2024-07-15 12:05:39.170306] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:24:25.586 [2024-07-15 12:05:39.170366] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:25.586 [2024-07-15 12:05:39.170585] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x295b4b0 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:25.905 "name": "raid_bdev1", 00:24:25.905 "uuid": "8effc169-7919-447f-9344-6f81833160a2", 00:24:25.905 "strip_size_kb": 0, 00:24:25.905 "state": "online", 00:24:25.905 "raid_level": "raid1", 00:24:25.905 "superblock": true, 00:24:25.905 "num_base_bdevs": 4, 00:24:25.905 "num_base_bdevs_discovered": 3, 00:24:25.905 "num_base_bdevs_operational": 3, 00:24:25.905 "base_bdevs_list": [ 00:24:25.905 { 00:24:25.905 "name": null, 00:24:25.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:25.905 "is_configured": false, 00:24:25.905 "data_offset": 2048, 00:24:25.905 "data_size": 63488 00:24:25.905 }, 00:24:25.905 { 00:24:25.905 "name": "BaseBdev2", 00:24:25.905 "uuid": "30412b52-b3ec-5dac-97d8-be3790286d7e", 00:24:25.905 "is_configured": true, 00:24:25.905 "data_offset": 2048, 00:24:25.905 "data_size": 63488 00:24:25.905 }, 00:24:25.905 { 00:24:25.905 "name": "BaseBdev3", 00:24:25.905 "uuid": "616718e0-9d43-57ef-8150-8c93f4a76967", 00:24:25.905 "is_configured": true, 00:24:25.905 "data_offset": 2048, 
00:24:25.905 "data_size": 63488 00:24:25.905 }, 00:24:25.905 { 00:24:25.905 "name": "BaseBdev4", 00:24:25.905 "uuid": "d30d8a5d-b03b-5fed-8746-7cb887274e0f", 00:24:25.905 "is_configured": true, 00:24:25.905 "data_offset": 2048, 00:24:25.905 "data_size": 63488 00:24:25.905 } 00:24:25.905 ] 00:24:25.905 }' 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:25.905 12:05:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:26.841 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:27.100 [2024-07-15 12:05:40.593237] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:27.100 [2024-07-15 12:05:40.593280] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:27.100 [2024-07-15 12:05:40.596378] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:27.100 [2024-07-15 12:05:40.596413] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:27.100 [2024-07-15 12:05:40.596507] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:27.100 [2024-07-15 12:05:40.596518] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2955090 name raid_bdev1, state offline 00:24:27.100 0 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 1562376 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@948 -- # '[' -z 1562376 ']' 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@952 -- # kill -0 1562376 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # uname 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1562376 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1562376' 00:24:27.100 killing process with pid 1562376 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # kill 1562376 00:24:27.100 [2024-07-15 12:05:40.677463] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:27.100 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@972 -- # wait 1562376 00:24:27.360 [2024-07-15 12:05:40.710112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.CO36ZkCOBz 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:24:27.360 00:24:27.360 real 0m10.701s 00:24:27.360 user 0m17.889s 00:24:27.360 sys 0m1.757s 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@1124 -- # xtrace_disable 00:24:27.360 12:05:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:24:27.360 ************************************ 00:24:27.360 END TEST raid_write_error_test 00:24:27.360 ************************************ 00:24:27.619 12:05:40 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:27.619 12:05:40 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:24:27.619 12:05:40 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:24:27.619 12:05:40 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:24:27.619 12:05:40 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:27.619 12:05:40 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:27.619 12:05:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:27.619 ************************************ 00:24:27.619 START TEST raid_rebuild_test 00:24:27.619 ************************************ 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false false true 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 
00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1563860 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1563860 /var/tmp/spdk-raid.sock 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1563860 ']' 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:27.619 12:05:41 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:27.619 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:27.619 12:05:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:27.619 [2024-07-15 12:05:41.100234] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:24:27.619 [2024-07-15 12:05:41.100295] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1563860 ] 00:24:27.619 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:27.620 Zero copy mechanism will not be used. 
00:24:27.880 [2024-07-15 12:05:41.216724] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:27.880 [2024-07-15 12:05:41.314268] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:27.880 [2024-07-15 12:05:41.377809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:27.880 [2024-07-15 12:05:41.377846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:28.456 12:05:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:28.456 12:05:42 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:24:28.456 12:05:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:28.456 12:05:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:29.023 BaseBdev1_malloc 00:24:29.023 12:05:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:29.589 [2024-07-15 12:05:43.031748] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:29.589 [2024-07-15 12:05:43.031799] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:29.589 [2024-07-15 12:05:43.031822] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x254a9c0 00:24:29.589 [2024-07-15 12:05:43.031835] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:29.589 [2024-07-15 12:05:43.033575] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:29.589 [2024-07-15 12:05:43.033604] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:29.589 BaseBdev1 00:24:29.589 12:05:43 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:29.589 12:05:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:30.154 BaseBdev2_malloc 00:24:30.154 12:05:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:30.721 [2024-07-15 12:05:44.059087] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:30.721 [2024-07-15 12:05:44.059134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:30.721 [2024-07-15 12:05:44.059156] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x254b510 00:24:30.721 [2024-07-15 12:05:44.059169] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:30.721 [2024-07-15 12:05:44.060699] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:30.721 [2024-07-15 12:05:44.060727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:30.721 BaseBdev2 00:24:30.721 12:05:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:31.288 spare_malloc 00:24:31.288 12:05:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:31.288 spare_delay 00:24:31.288 12:05:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b spare_delay -p spare 00:24:31.856 [2024-07-15 12:05:45.356262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:31.856 [2024-07-15 12:05:45.356310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:31.856 [2024-07-15 12:05:45.356330] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f2520 00:24:31.856 [2024-07-15 12:05:45.356343] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:31.856 [2024-07-15 12:05:45.357940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:31.856 [2024-07-15 12:05:45.357968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:31.856 spare 00:24:31.856 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:32.424 [2024-07-15 12:05:45.869611] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:32.424 [2024-07-15 12:05:45.870971] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:32.424 [2024-07-15 12:05:45.871049] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26f3ab0 00:24:32.424 [2024-07-15 12:05:45.871060] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:32.424 [2024-07-15 12:05:45.871268] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x254a5c0 00:24:32.424 [2024-07-15 12:05:45.871416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26f3ab0 00:24:32.424 [2024-07-15 12:05:45.871426] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26f3ab0 00:24:32.424 [2024-07-15 12:05:45.871547] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.424 12:05:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.683 12:05:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:32.683 "name": "raid_bdev1", 00:24:32.684 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:32.684 "strip_size_kb": 0, 00:24:32.684 "state": "online", 00:24:32.684 "raid_level": "raid1", 00:24:32.684 "superblock": false, 00:24:32.684 "num_base_bdevs": 2, 00:24:32.684 "num_base_bdevs_discovered": 2, 00:24:32.684 "num_base_bdevs_operational": 2, 00:24:32.684 "base_bdevs_list": [ 00:24:32.684 { 00:24:32.684 "name": "BaseBdev1", 00:24:32.684 "uuid": 
"a957baad-dc41-508e-9f59-c6c88281ad3d", 00:24:32.684 "is_configured": true, 00:24:32.684 "data_offset": 0, 00:24:32.684 "data_size": 65536 00:24:32.684 }, 00:24:32.684 { 00:24:32.684 "name": "BaseBdev2", 00:24:32.684 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:32.684 "is_configured": true, 00:24:32.684 "data_offset": 0, 00:24:32.684 "data_size": 65536 00:24:32.684 } 00:24:32.684 ] 00:24:32.684 }' 00:24:32.684 12:05:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:32.684 12:05:46 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:33.621 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:33.621 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:33.621 [2024-07-15 12:05:47.157315] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:33.621 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:33.621 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.621 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:33.880 12:05:47 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:33.880 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:34.448 [2024-07-15 12:05:47.915111] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26f35c0 00:24:34.448 /dev/nbd0 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:34.448 1+0 records in 00:24:34.448 1+0 records out 00:24:34.448 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253315 s, 16.2 MB/s 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:34.448 12:05:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:41.018 65536+0 records in 00:24:41.018 65536+0 records out 00:24:41.018 33554432 bytes (34 MB, 32 MiB) copied, 6.22357 s, 5.4 MB/s 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-raid.sock 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:41.018 [2024-07-15 12:05:54.482003] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:41.018 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:41.276 [2024-07-15 12:05:54.710632] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:41.276 12:05:54 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.276 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.535 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.535 "name": "raid_bdev1", 00:24:41.535 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:41.535 "strip_size_kb": 0, 00:24:41.535 "state": "online", 00:24:41.535 "raid_level": "raid1", 00:24:41.535 "superblock": false, 00:24:41.535 "num_base_bdevs": 2, 00:24:41.535 "num_base_bdevs_discovered": 1, 00:24:41.535 "num_base_bdevs_operational": 1, 00:24:41.535 "base_bdevs_list": [ 00:24:41.535 { 00:24:41.535 "name": null, 00:24:41.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.535 "is_configured": false, 00:24:41.535 "data_offset": 0, 00:24:41.535 "data_size": 65536 00:24:41.535 }, 00:24:41.535 { 00:24:41.535 "name": "BaseBdev2", 
00:24:41.535 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:41.535 "is_configured": true, 00:24:41.535 "data_offset": 0, 00:24:41.535 "data_size": 65536 00:24:41.535 } 00:24:41.535 ] 00:24:41.535 }' 00:24:41.535 12:05:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.535 12:05:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:42.102 12:05:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:42.360 [2024-07-15 12:05:55.821585] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:42.360 [2024-07-15 12:05:55.826495] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2252230 00:24:42.360 [2024-07-15 12:05:55.828729] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:42.360 12:05:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.294 12:05:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.552 12:05:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.552 "name": "raid_bdev1", 00:24:43.552 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:43.552 "strip_size_kb": 0, 00:24:43.552 "state": "online", 00:24:43.552 "raid_level": "raid1", 00:24:43.552 "superblock": false, 00:24:43.552 "num_base_bdevs": 2, 00:24:43.552 "num_base_bdevs_discovered": 2, 00:24:43.553 "num_base_bdevs_operational": 2, 00:24:43.553 "process": { 00:24:43.553 "type": "rebuild", 00:24:43.553 "target": "spare", 00:24:43.553 "progress": { 00:24:43.553 "blocks": 24576, 00:24:43.553 "percent": 37 00:24:43.553 } 00:24:43.553 }, 00:24:43.553 "base_bdevs_list": [ 00:24:43.553 { 00:24:43.553 "name": "spare", 00:24:43.553 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:43.553 "is_configured": true, 00:24:43.553 "data_offset": 0, 00:24:43.553 "data_size": 65536 00:24:43.553 }, 00:24:43.553 { 00:24:43.553 "name": "BaseBdev2", 00:24:43.553 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:43.553 "is_configured": true, 00:24:43.553 "data_offset": 0, 00:24:43.553 "data_size": 65536 00:24:43.553 } 00:24:43.553 ] 00:24:43.553 }' 00:24:43.553 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.812 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:43.812 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.812 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:43.812 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:43.812 [2024-07-15 12:05:57.342754] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.071 [2024-07-15 12:05:57.440769] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev 
raid_bdev1: No such device 00:24:44.071 [2024-07-15 12:05:57.440816] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.071 [2024-07-15 12:05:57.440831] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.071 [2024-07-15 12:05:57.440840] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.071 "name": "raid_bdev1", 00:24:44.071 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:44.071 
"strip_size_kb": 0, 00:24:44.071 "state": "online", 00:24:44.071 "raid_level": "raid1", 00:24:44.071 "superblock": false, 00:24:44.071 "num_base_bdevs": 2, 00:24:44.071 "num_base_bdevs_discovered": 1, 00:24:44.071 "num_base_bdevs_operational": 1, 00:24:44.071 "base_bdevs_list": [ 00:24:44.071 { 00:24:44.071 "name": null, 00:24:44.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.071 "is_configured": false, 00:24:44.071 "data_offset": 0, 00:24:44.071 "data_size": 65536 00:24:44.071 }, 00:24:44.071 { 00:24:44.071 "name": "BaseBdev2", 00:24:44.071 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:44.071 "is_configured": true, 00:24:44.071 "data_offset": 0, 00:24:44.071 "data_size": 65536 00:24:44.071 } 00:24:44.071 ] 00:24:44.071 }' 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.071 12:05:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.007 "name": "raid_bdev1", 00:24:45.007 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 
00:24:45.007 "strip_size_kb": 0, 00:24:45.007 "state": "online", 00:24:45.007 "raid_level": "raid1", 00:24:45.007 "superblock": false, 00:24:45.007 "num_base_bdevs": 2, 00:24:45.007 "num_base_bdevs_discovered": 1, 00:24:45.007 "num_base_bdevs_operational": 1, 00:24:45.007 "base_bdevs_list": [ 00:24:45.007 { 00:24:45.007 "name": null, 00:24:45.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.007 "is_configured": false, 00:24:45.007 "data_offset": 0, 00:24:45.007 "data_size": 65536 00:24:45.007 }, 00:24:45.007 { 00:24:45.007 "name": "BaseBdev2", 00:24:45.007 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:45.007 "is_configured": true, 00:24:45.007 "data_offset": 0, 00:24:45.007 "data_size": 65536 00:24:45.007 } 00:24:45.007 ] 00:24:45.007 }' 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.007 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:45.266 [2024-07-15 12:05:58.812772] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:45.266 [2024-07-15 12:05:58.817872] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26f39b0 00:24:45.266 [2024-07-15 12:05:58.819349] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:45.266 12:05:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.644 12:05:59 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.644 "name": "raid_bdev1", 00:24:46.644 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:46.644 "strip_size_kb": 0, 00:24:46.644 "state": "online", 00:24:46.644 "raid_level": "raid1", 00:24:46.644 "superblock": false, 00:24:46.644 "num_base_bdevs": 2, 00:24:46.644 "num_base_bdevs_discovered": 2, 00:24:46.644 "num_base_bdevs_operational": 2, 00:24:46.644 "process": { 00:24:46.644 "type": "rebuild", 00:24:46.644 "target": "spare", 00:24:46.644 "progress": { 00:24:46.644 "blocks": 24576, 00:24:46.644 "percent": 37 00:24:46.644 } 00:24:46.644 }, 00:24:46.644 "base_bdevs_list": [ 00:24:46.644 { 00:24:46.644 "name": "spare", 00:24:46.644 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:46.644 "is_configured": true, 00:24:46.644 "data_offset": 0, 00:24:46.644 "data_size": 65536 00:24:46.644 }, 00:24:46.644 { 00:24:46.644 "name": "BaseBdev2", 00:24:46.644 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:46.644 "is_configured": true, 00:24:46.644 "data_offset": 0, 00:24:46.644 "data_size": 65536 00:24:46.644 } 00:24:46.644 ] 00:24:46.644 }' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # 
jq -r '.process.type // "none"' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:24:46.644 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=794 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.645 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.903 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:46.903 "name": "raid_bdev1", 00:24:46.903 
"uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:46.903 "strip_size_kb": 0, 00:24:46.903 "state": "online", 00:24:46.903 "raid_level": "raid1", 00:24:46.903 "superblock": false, 00:24:46.903 "num_base_bdevs": 2, 00:24:46.903 "num_base_bdevs_discovered": 2, 00:24:46.903 "num_base_bdevs_operational": 2, 00:24:46.903 "process": { 00:24:46.903 "type": "rebuild", 00:24:46.903 "target": "spare", 00:24:46.903 "progress": { 00:24:46.903 "blocks": 30720, 00:24:46.903 "percent": 46 00:24:46.903 } 00:24:46.903 }, 00:24:46.903 "base_bdevs_list": [ 00:24:46.903 { 00:24:46.903 "name": "spare", 00:24:46.903 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:46.903 "is_configured": true, 00:24:46.903 "data_offset": 0, 00:24:46.903 "data_size": 65536 00:24:46.903 }, 00:24:46.903 { 00:24:46.903 "name": "BaseBdev2", 00:24:46.903 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:46.903 "is_configured": true, 00:24:46.903 "data_offset": 0, 00:24:46.903 "data_size": 65536 00:24:46.903 } 00:24:46.903 ] 00:24:46.903 }' 00:24:46.903 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:46.903 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:46.903 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.162 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.162 12:06:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.099 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.358 "name": "raid_bdev1", 00:24:48.358 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:48.358 "strip_size_kb": 0, 00:24:48.358 "state": "online", 00:24:48.358 "raid_level": "raid1", 00:24:48.358 "superblock": false, 00:24:48.358 "num_base_bdevs": 2, 00:24:48.358 "num_base_bdevs_discovered": 2, 00:24:48.358 "num_base_bdevs_operational": 2, 00:24:48.358 "process": { 00:24:48.358 "type": "rebuild", 00:24:48.358 "target": "spare", 00:24:48.358 "progress": { 00:24:48.358 "blocks": 59392, 00:24:48.358 "percent": 90 00:24:48.358 } 00:24:48.358 }, 00:24:48.358 "base_bdevs_list": [ 00:24:48.358 { 00:24:48.358 "name": "spare", 00:24:48.358 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:48.358 "is_configured": true, 00:24:48.358 "data_offset": 0, 00:24:48.358 "data_size": 65536 00:24:48.358 }, 00:24:48.358 { 00:24:48.358 "name": "BaseBdev2", 00:24:48.358 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:48.358 "is_configured": true, 00:24:48.358 "data_offset": 0, 00:24:48.358 "data_size": 65536 00:24:48.358 } 00:24:48.358 ] 00:24:48.358 }' 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.358 12:06:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.617 [2024-07-15 12:06:02.044503] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:48.617 [2024-07-15 12:06:02.044559] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:48.617 [2024-07-15 12:06:02.044594] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.555 12:06:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.555 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.555 "name": "raid_bdev1", 00:24:49.555 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:49.555 "strip_size_kb": 0, 00:24:49.555 "state": "online", 00:24:49.555 "raid_level": "raid1", 00:24:49.555 "superblock": false, 00:24:49.555 "num_base_bdevs": 2, 00:24:49.555 
"num_base_bdevs_discovered": 2, 00:24:49.555 "num_base_bdevs_operational": 2, 00:24:49.555 "base_bdevs_list": [ 00:24:49.555 { 00:24:49.555 "name": "spare", 00:24:49.555 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:49.555 "is_configured": true, 00:24:49.555 "data_offset": 0, 00:24:49.555 "data_size": 65536 00:24:49.555 }, 00:24:49.555 { 00:24:49.555 "name": "BaseBdev2", 00:24:49.555 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:49.555 "is_configured": true, 00:24:49.555 "data_offset": 0, 00:24:49.555 "data_size": 65536 00:24:49.555 } 00:24:49.555 ] 00:24:49.555 }' 00:24:49.555 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.814 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.072 12:06:03 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:50.072 "name": "raid_bdev1", 00:24:50.072 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:50.072 "strip_size_kb": 0, 00:24:50.072 "state": "online", 00:24:50.072 "raid_level": "raid1", 00:24:50.072 "superblock": false, 00:24:50.072 "num_base_bdevs": 2, 00:24:50.072 "num_base_bdevs_discovered": 2, 00:24:50.072 "num_base_bdevs_operational": 2, 00:24:50.072 "base_bdevs_list": [ 00:24:50.072 { 00:24:50.072 "name": "spare", 00:24:50.072 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:50.072 "is_configured": true, 00:24:50.072 "data_offset": 0, 00:24:50.072 "data_size": 65536 00:24:50.072 }, 00:24:50.072 { 00:24:50.072 "name": "BaseBdev2", 00:24:50.072 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:50.072 "is_configured": true, 00:24:50.072 "data_offset": 0, 00:24:50.072 "data_size": 65536 00:24:50.072 } 00:24:50.072 ] 00:24:50.072 }' 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.072 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:50.331 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:50.331 "name": "raid_bdev1", 00:24:50.331 "uuid": "95ac6be7-9b56-4ad9-adc9-1ffa17d857ba", 00:24:50.331 "strip_size_kb": 0, 00:24:50.331 "state": "online", 00:24:50.331 "raid_level": "raid1", 00:24:50.331 "superblock": false, 00:24:50.331 "num_base_bdevs": 2, 00:24:50.331 "num_base_bdevs_discovered": 2, 00:24:50.331 "num_base_bdevs_operational": 2, 00:24:50.331 "base_bdevs_list": [ 00:24:50.331 { 00:24:50.331 "name": "spare", 00:24:50.331 "uuid": "0274f997-208a-59b4-8264-8721ab770b24", 00:24:50.331 "is_configured": true, 00:24:50.331 "data_offset": 0, 00:24:50.331 "data_size": 65536 00:24:50.331 }, 00:24:50.331 { 00:24:50.331 "name": "BaseBdev2", 00:24:50.331 "uuid": "8629f5d9-3004-50c3-9aad-bf43f05cee8f", 00:24:50.331 "is_configured": true, 00:24:50.331 "data_offset": 0, 00:24:50.331 "data_size": 65536 00:24:50.331 } 00:24:50.331 ] 00:24:50.331 }' 00:24:50.331 12:06:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:50.331 12:06:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:50.898 12:06:04 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:51.155 [2024-07-15 12:06:04.664146] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:51.155 [2024-07-15 12:06:04.664174] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:51.155 [2024-07-15 12:06:04.664231] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:51.155 [2024-07-15 12:06:04.664287] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:51.155 [2024-07-15 12:06:04.664299] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f3ab0 name raid_bdev1, state offline 00:24:51.155 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.155 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.413 12:06:04 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:51.698 /dev/nbd0 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:51.698 1+0 records in 00:24:51.698 1+0 records out 00:24:51.698 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263931 s, 15.5 MB/s 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.698 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:51.956 /dev/nbd1 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 
00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:51.956 1+0 records in 00:24:51.956 1+0 records out 00:24:51.956 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295693 s, 13.9 MB/s 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:51.956 12:06:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.215 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:52.478 12:06:05 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:52.771 12:06:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1563860 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1563860 ']' 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1563860 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1563860 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1563860' 00:24:52.771 killing process with pid 1563860 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1563860 00:24:52.771 Received shutdown signal, test time was about 60.000000 seconds 00:24:52.771 00:24:52.771 Latency(us) 00:24:52.771 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:52.771 =================================================================================================================== 00:24:52.771 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:52.771 [2024-07-15 12:06:06.151968] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:52.771 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1563860 00:24:52.771 [2024-07-15 12:06:06.180432] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:53.037 12:06:06 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@784 -- # return 0 00:24:53.038 00:24:53.038 real 0m25.363s 00:24:53.038 user 0m34.591s 00:24:53.038 sys 0m5.610s 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:53.038 ************************************ 00:24:53.038 END TEST raid_rebuild_test 00:24:53.038 ************************************ 00:24:53.038 12:06:06 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:24:53.038 12:06:06 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:24:53.038 12:06:06 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:24:53.038 12:06:06 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:24:53.038 12:06:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:53.038 ************************************ 00:24:53.038 START TEST raid_rebuild_test_sb 00:24:53.038 ************************************ 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1567916 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1567916 /var/tmp/spdk-raid.sock 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1567916 ']' 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:53.038 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:24:53.038 12:06:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:53.038 [2024-07-15 12:06:06.540778] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:24:53.038 [2024-07-15 12:06:06.540844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1567916 ] 00:24:53.038 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:53.038 Zero copy mechanism will not be used. 
00:24:53.297 [2024-07-15 12:06:06.666769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:53.297 [2024-07-15 12:06:06.771563] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:53.297 [2024-07-15 12:06:06.834744] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:53.297 [2024-07-15 12:06:06.834776] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:54.234 12:06:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:24:54.234 12:06:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:24:54.234 12:06:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:54.234 12:06:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:54.234 BaseBdev1_malloc 00:24:54.234 12:06:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:54.494 [2024-07-15 12:06:07.955157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:54.494 [2024-07-15 12:06:07.955211] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.494 [2024-07-15 12:06:07.955232] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b569c0 00:24:54.494 [2024-07-15 12:06:07.955245] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.494 [2024-07-15 12:06:07.956794] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.494 [2024-07-15 12:06:07.956820] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:54.494 BaseBdev1 
00:24:54.494 12:06:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:54.494 12:06:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:54.753 BaseBdev2_malloc 00:24:54.753 12:06:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:55.012 [2024-07-15 12:06:08.465260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:55.012 [2024-07-15 12:06:08.465315] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:55.012 [2024-07-15 12:06:08.465338] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b57510 00:24:55.012 [2024-07-15 12:06:08.465350] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:55.012 [2024-07-15 12:06:08.466790] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:55.012 [2024-07-15 12:06:08.466818] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:55.012 BaseBdev2 00:24:55.012 12:06:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:55.271 spare_malloc 00:24:55.271 12:06:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:55.530 spare_delay 00:24:55.530 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:55.789 [2024-07-15 12:06:09.231900] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:55.789 [2024-07-15 12:06:09.231948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:55.789 [2024-07-15 12:06:09.231967] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfe520 00:24:55.789 [2024-07-15 12:06:09.231979] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:55.789 [2024-07-15 12:06:09.233474] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:55.789 [2024-07-15 12:06:09.233503] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:55.789 spare 00:24:55.789 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:24:56.048 [2024-07-15 12:06:09.488607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:56.048 [2024-07-15 12:06:09.489837] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:56.048 [2024-07-15 12:06:09.489994] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cffab0 00:24:56.048 [2024-07-15 12:06:09.490007] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:56.048 [2024-07-15 12:06:09.490204] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b565c0 00:24:56.048 [2024-07-15 12:06:09.490345] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cffab0 00:24:56.048 [2024-07-15 12:06:09.490355] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1cffab0 00:24:56.048 [2024-07-15 12:06:09.490448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.048 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.307 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.307 "name": "raid_bdev1", 00:24:56.307 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:24:56.307 "strip_size_kb": 0, 00:24:56.307 "state": "online", 00:24:56.307 "raid_level": "raid1", 00:24:56.307 "superblock": true, 00:24:56.307 "num_base_bdevs": 2, 00:24:56.307 "num_base_bdevs_discovered": 2, 00:24:56.307 
"num_base_bdevs_operational": 2, 00:24:56.307 "base_bdevs_list": [ 00:24:56.307 { 00:24:56.307 "name": "BaseBdev1", 00:24:56.307 "uuid": "f8102b91-324e-5117-93d0-5fe63c1215e1", 00:24:56.307 "is_configured": true, 00:24:56.307 "data_offset": 2048, 00:24:56.307 "data_size": 63488 00:24:56.307 }, 00:24:56.307 { 00:24:56.307 "name": "BaseBdev2", 00:24:56.307 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:24:56.307 "is_configured": true, 00:24:56.307 "data_offset": 2048, 00:24:56.307 "data_size": 63488 00:24:56.307 } 00:24:56.307 ] 00:24:56.307 }' 00:24:56.307 12:06:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.307 12:06:09 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:56.875 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:56.875 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:57.134 [2024-07-15 12:06:10.591769] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:57.134 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:57.134 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.134 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:57.393 12:06:10 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:57.653 [2024-07-15 12:06:11.088858] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b565c0 00:24:57.653 /dev/nbd0 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:57.653 1+0 records in 00:24:57.653 1+0 records out 00:24:57.653 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000260248 s, 15.7 MB/s 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:24:57.653 12:06:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:25:04.226 63488+0 records in 00:25:04.226 63488+0 records out 00:25:04.226 32505856 bytes (33 MB, 
31 MiB) copied, 5.89051 s, 5.5 MB/s 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:04.226 [2024-07-15 12:06:17.317983] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:25:04.226 [2024-07-15 12:06:17.554659] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.226 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:04.484 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:04.484 "name": "raid_bdev1", 00:25:04.484 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:04.484 "strip_size_kb": 0, 00:25:04.484 "state": "online", 00:25:04.484 "raid_level": "raid1", 00:25:04.484 "superblock": true, 00:25:04.484 "num_base_bdevs": 2, 00:25:04.484 "num_base_bdevs_discovered": 1, 00:25:04.484 
"num_base_bdevs_operational": 1, 00:25:04.484 "base_bdevs_list": [ 00:25:04.484 { 00:25:04.484 "name": null, 00:25:04.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:04.484 "is_configured": false, 00:25:04.484 "data_offset": 2048, 00:25:04.484 "data_size": 63488 00:25:04.484 }, 00:25:04.484 { 00:25:04.484 "name": "BaseBdev2", 00:25:04.484 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:04.484 "is_configured": true, 00:25:04.484 "data_offset": 2048, 00:25:04.484 "data_size": 63488 00:25:04.484 } 00:25:04.484 ] 00:25:04.484 }' 00:25:04.484 12:06:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:04.484 12:06:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:05.051 12:06:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:05.051 [2024-07-15 12:06:18.637539] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:05.051 [2024-07-15 12:06:18.642577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfb7d0 00:25:05.051 [2024-07-15 12:06:18.644809] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:05.310 12:06:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.247 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:06.506 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:06.506 "name": "raid_bdev1", 00:25:06.506 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:06.506 "strip_size_kb": 0, 00:25:06.506 "state": "online", 00:25:06.506 "raid_level": "raid1", 00:25:06.506 "superblock": true, 00:25:06.506 "num_base_bdevs": 2, 00:25:06.507 "num_base_bdevs_discovered": 2, 00:25:06.507 "num_base_bdevs_operational": 2, 00:25:06.507 "process": { 00:25:06.507 "type": "rebuild", 00:25:06.507 "target": "spare", 00:25:06.507 "progress": { 00:25:06.507 "blocks": 24576, 00:25:06.507 "percent": 38 00:25:06.507 } 00:25:06.507 }, 00:25:06.507 "base_bdevs_list": [ 00:25:06.507 { 00:25:06.507 "name": "spare", 00:25:06.507 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:06.507 "is_configured": true, 00:25:06.507 "data_offset": 2048, 00:25:06.507 "data_size": 63488 00:25:06.507 }, 00:25:06.507 { 00:25:06.507 "name": "BaseBdev2", 00:25:06.507 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:06.507 "is_configured": true, 00:25:06.507 "data_offset": 2048, 00:25:06.507 "data_size": 63488 00:25:06.507 } 00:25:06.507 ] 00:25:06.507 }' 00:25:06.507 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:06.507 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:06.507 12:06:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.507 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:06.507 12:06:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:06.766 [2024-07-15 12:06:20.243214] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.766 [2024-07-15 12:06:20.257571] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:06.766 [2024-07-15 12:06:20.257614] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:06.766 [2024-07-15 12:06:20.257628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:06.766 [2024-07-15 12:06:20.257637] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:06.766 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.025 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.025 "name": "raid_bdev1", 00:25:07.025 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:07.025 "strip_size_kb": 0, 00:25:07.025 "state": "online", 00:25:07.025 "raid_level": "raid1", 00:25:07.025 "superblock": true, 00:25:07.025 "num_base_bdevs": 2, 00:25:07.025 "num_base_bdevs_discovered": 1, 00:25:07.025 "num_base_bdevs_operational": 1, 00:25:07.025 "base_bdevs_list": [ 00:25:07.025 { 00:25:07.025 "name": null, 00:25:07.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.025 "is_configured": false, 00:25:07.025 "data_offset": 2048, 00:25:07.025 "data_size": 63488 00:25:07.025 }, 00:25:07.025 { 00:25:07.025 "name": "BaseBdev2", 00:25:07.025 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:07.025 "is_configured": true, 00:25:07.025 "data_offset": 2048, 00:25:07.025 "data_size": 63488 00:25:07.025 } 00:25:07.025 ] 00:25:07.025 }' 00:25:07.025 12:06:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.025 12:06:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:07.592 12:06:21 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.592 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.850 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:07.850 "name": "raid_bdev1", 00:25:07.850 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:07.850 "strip_size_kb": 0, 00:25:07.850 "state": "online", 00:25:07.850 "raid_level": "raid1", 00:25:07.850 "superblock": true, 00:25:07.850 "num_base_bdevs": 2, 00:25:07.850 "num_base_bdevs_discovered": 1, 00:25:07.850 "num_base_bdevs_operational": 1, 00:25:07.850 "base_bdevs_list": [ 00:25:07.850 { 00:25:07.850 "name": null, 00:25:07.850 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.850 "is_configured": false, 00:25:07.850 "data_offset": 2048, 00:25:07.850 "data_size": 63488 00:25:07.850 }, 00:25:07.850 { 00:25:07.850 "name": "BaseBdev2", 00:25:07.850 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:07.850 "is_configured": true, 00:25:07.851 "data_offset": 2048, 00:25:07.851 "data_size": 63488 00:25:07.851 } 00:25:07.851 ] 00:25:07.851 }' 00:25:07.851 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:07.851 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:07.851 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.109 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.109 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:08.109 [2024-07-15 12:06:21.686574] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:08.109 [2024-07-15 12:06:21.692220] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b55fb0 00:25:08.109 [2024-07-15 12:06:21.693744] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:08.368 12:06:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.305 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.564 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.564 "name": "raid_bdev1", 00:25:09.564 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:09.564 "strip_size_kb": 0, 00:25:09.564 "state": "online", 00:25:09.564 "raid_level": "raid1", 00:25:09.564 "superblock": true, 00:25:09.564 "num_base_bdevs": 2, 00:25:09.564 "num_base_bdevs_discovered": 2, 00:25:09.564 "num_base_bdevs_operational": 2, 00:25:09.564 "process": { 00:25:09.564 "type": "rebuild", 00:25:09.564 "target": "spare", 00:25:09.564 "progress": { 00:25:09.564 "blocks": 24576, 00:25:09.564 "percent": 38 00:25:09.564 } 00:25:09.564 }, 00:25:09.564 
"base_bdevs_list": [ 00:25:09.564 { 00:25:09.564 "name": "spare", 00:25:09.564 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:09.564 "is_configured": true, 00:25:09.564 "data_offset": 2048, 00:25:09.564 "data_size": 63488 00:25:09.564 }, 00:25:09.564 { 00:25:09.564 "name": "BaseBdev2", 00:25:09.564 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:09.564 "is_configured": true, 00:25:09.564 "data_offset": 2048, 00:25:09.564 "data_size": 63488 00:25:09.564 } 00:25:09.564 ] 00:25:09.564 }' 00:25:09.564 12:06:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:09.564 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=817 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:09.564 12:06:23 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.564 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:09.824 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:09.824 "name": "raid_bdev1", 00:25:09.824 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:09.824 "strip_size_kb": 0, 00:25:09.824 "state": "online", 00:25:09.824 "raid_level": "raid1", 00:25:09.824 "superblock": true, 00:25:09.824 "num_base_bdevs": 2, 00:25:09.824 "num_base_bdevs_discovered": 2, 00:25:09.824 "num_base_bdevs_operational": 2, 00:25:09.824 "process": { 00:25:09.824 "type": "rebuild", 00:25:09.824 "target": "spare", 00:25:09.824 "progress": { 00:25:09.824 "blocks": 32768, 00:25:09.824 "percent": 51 00:25:09.824 } 00:25:09.824 }, 00:25:09.824 "base_bdevs_list": [ 00:25:09.824 { 00:25:09.824 "name": "spare", 00:25:09.824 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:09.824 "is_configured": true, 00:25:09.824 "data_offset": 2048, 00:25:09.824 "data_size": 63488 00:25:09.824 }, 00:25:09.824 { 00:25:09.824 "name": "BaseBdev2", 00:25:09.824 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:09.824 "is_configured": true, 00:25:09.824 "data_offset": 2048, 00:25:09.824 "data_size": 63488 00:25:09.824 } 00:25:09.824 ] 00:25:09.824 }' 00:25:09.824 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:25:09.824 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:09.824 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:10.081 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:10.081 12:06:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.018 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:11.277 "name": "raid_bdev1", 00:25:11.277 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:11.277 "strip_size_kb": 0, 00:25:11.277 "state": "online", 00:25:11.277 "raid_level": "raid1", 00:25:11.277 "superblock": true, 00:25:11.277 "num_base_bdevs": 2, 00:25:11.277 "num_base_bdevs_discovered": 2, 00:25:11.277 "num_base_bdevs_operational": 2, 00:25:11.277 "process": { 00:25:11.277 "type": "rebuild", 00:25:11.277 "target": "spare", 
00:25:11.277 "progress": { 00:25:11.277 "blocks": 59392, 00:25:11.277 "percent": 93 00:25:11.277 } 00:25:11.277 }, 00:25:11.277 "base_bdevs_list": [ 00:25:11.277 { 00:25:11.277 "name": "spare", 00:25:11.277 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:11.277 "is_configured": true, 00:25:11.277 "data_offset": 2048, 00:25:11.277 "data_size": 63488 00:25:11.277 }, 00:25:11.277 { 00:25:11.277 "name": "BaseBdev2", 00:25:11.277 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:11.277 "is_configured": true, 00:25:11.277 "data_offset": 2048, 00:25:11.277 "data_size": 63488 00:25:11.277 } 00:25:11.277 ] 00:25:11.277 }' 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:11.277 12:06:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:11.277 [2024-07-15 12:06:24.818397] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:11.277 [2024-07-15 12:06:24.818464] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:11.277 [2024-07-15 12:06:24.818550] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.214 12:06:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:12.782 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:12.782 "name": "raid_bdev1", 00:25:12.782 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:12.782 "strip_size_kb": 0, 00:25:12.782 "state": "online", 00:25:12.782 "raid_level": "raid1", 00:25:12.782 "superblock": true, 00:25:12.782 "num_base_bdevs": 2, 00:25:12.782 "num_base_bdevs_discovered": 2, 00:25:12.782 "num_base_bdevs_operational": 2, 00:25:12.782 "base_bdevs_list": [ 00:25:12.782 { 00:25:12.782 "name": "spare", 00:25:12.782 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:12.782 "is_configured": true, 00:25:12.782 "data_offset": 2048, 00:25:12.782 "data_size": 63488 00:25:12.782 }, 00:25:12.782 { 00:25:12.782 "name": "BaseBdev2", 00:25:12.782 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:12.782 "is_configured": true, 00:25:12.782 "data_offset": 2048, 00:25:12.782 "data_size": 63488 00:25:12.782 } 00:25:12.782 ] 00:25:12.782 }' 00:25:12.782 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:12.782 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:12.782 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:13.040 
12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.040 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.298 "name": "raid_bdev1", 00:25:13.298 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:13.298 "strip_size_kb": 0, 00:25:13.298 "state": "online", 00:25:13.298 "raid_level": "raid1", 00:25:13.298 "superblock": true, 00:25:13.298 "num_base_bdevs": 2, 00:25:13.298 "num_base_bdevs_discovered": 2, 00:25:13.298 "num_base_bdevs_operational": 2, 00:25:13.298 "base_bdevs_list": [ 00:25:13.298 { 00:25:13.298 "name": "spare", 00:25:13.298 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:13.298 "is_configured": true, 00:25:13.298 "data_offset": 2048, 00:25:13.298 "data_size": 63488 00:25:13.298 }, 00:25:13.298 { 00:25:13.298 "name": "BaseBdev2", 00:25:13.298 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:13.298 "is_configured": true, 00:25:13.298 "data_offset": 2048, 00:25:13.298 "data_size": 63488 00:25:13.298 } 00:25:13.298 ] 00:25:13.298 }' 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.298 12:06:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.557 12:06:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.557 "name": "raid_bdev1", 00:25:13.557 "uuid": 
"4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:13.557 "strip_size_kb": 0, 00:25:13.557 "state": "online", 00:25:13.557 "raid_level": "raid1", 00:25:13.557 "superblock": true, 00:25:13.557 "num_base_bdevs": 2, 00:25:13.557 "num_base_bdevs_discovered": 2, 00:25:13.557 "num_base_bdevs_operational": 2, 00:25:13.557 "base_bdevs_list": [ 00:25:13.557 { 00:25:13.557 "name": "spare", 00:25:13.557 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:13.557 "is_configured": true, 00:25:13.557 "data_offset": 2048, 00:25:13.557 "data_size": 63488 00:25:13.557 }, 00:25:13.557 { 00:25:13.557 "name": "BaseBdev2", 00:25:13.557 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:13.557 "is_configured": true, 00:25:13.557 "data_offset": 2048, 00:25:13.557 "data_size": 63488 00:25:13.557 } 00:25:13.557 ] 00:25:13.557 }' 00:25:13.557 12:06:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.557 12:06:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:14.125 12:06:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:14.383 [2024-07-15 12:06:27.847759] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:14.383 [2024-07-15 12:06:27.847795] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:14.383 [2024-07-15 12:06:27.847857] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:14.383 [2024-07-15 12:06:27.847914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:14.383 [2024-07-15 12:06:27.847925] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cffab0 name raid_bdev1, state offline 00:25:14.383 12:06:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.383 12:06:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:14.643 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:14.902 /dev/nbd0 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:14.902 1+0 records in 00:25:14.902 1+0 records out 00:25:14.902 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269417 s, 15.2 MB/s 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:14.902 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:14.902 12:06:28 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:15.161 /dev/nbd1 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:15.161 1+0 records in 00:25:15.161 1+0 records out 00:25:15.161 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000312486 s, 13.1 MB/s 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:15.161 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.421 12:06:28 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:15.681 12:06:29 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:15.681 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:15.941 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:16.201 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:16.461 [2024-07-15 12:06:29.842636] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:25:16.461 [2024-07-15 12:06:29.842694] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:16.461 [2024-07-15 12:06:29.842716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cfb5d0 00:25:16.461 [2024-07-15 12:06:29.842729] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:16.461 [2024-07-15 12:06:29.844379] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:16.461 [2024-07-15 12:06:29.844412] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:16.461 [2024-07-15 12:06:29.844493] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:16.461 [2024-07-15 12:06:29.844525] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:16.461 [2024-07-15 12:06:29.844630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:16.461 spare 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.461 12:06:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.461 [2024-07-15 12:06:29.944950] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b4fa40 00:25:16.461 [2024-07-15 12:06:29.944968] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:16.461 [2024-07-15 12:06:29.945173] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4dd20 00:25:16.461 [2024-07-15 12:06:29.945322] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b4fa40 00:25:16.461 [2024-07-15 12:06:29.945332] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1b4fa40 00:25:16.461 [2024-07-15 12:06:29.945436] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:16.721 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:16.721 "name": "raid_bdev1", 00:25:16.721 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:16.721 "strip_size_kb": 0, 00:25:16.721 "state": "online", 00:25:16.721 "raid_level": "raid1", 00:25:16.721 "superblock": true, 00:25:16.721 "num_base_bdevs": 2, 00:25:16.721 "num_base_bdevs_discovered": 2, 00:25:16.721 "num_base_bdevs_operational": 2, 00:25:16.721 "base_bdevs_list": [ 00:25:16.721 { 00:25:16.721 "name": "spare", 00:25:16.721 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:16.721 "is_configured": true, 00:25:16.721 "data_offset": 2048, 00:25:16.721 "data_size": 63488 00:25:16.721 }, 00:25:16.721 { 00:25:16.721 "name": "BaseBdev2", 00:25:16.721 "uuid": 
"a8024f49-2228-548f-9781-9ed031c939a7", 00:25:16.721 "is_configured": true, 00:25:16.721 "data_offset": 2048, 00:25:16.721 "data_size": 63488 00:25:16.721 } 00:25:16.721 ] 00:25:16.721 }' 00:25:16.721 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:16.721 12:06:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.289 12:06:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.547 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.547 "name": "raid_bdev1", 00:25:17.547 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:17.547 "strip_size_kb": 0, 00:25:17.547 "state": "online", 00:25:17.547 "raid_level": "raid1", 00:25:17.547 "superblock": true, 00:25:17.547 "num_base_bdevs": 2, 00:25:17.547 "num_base_bdevs_discovered": 2, 00:25:17.547 "num_base_bdevs_operational": 2, 00:25:17.547 "base_bdevs_list": [ 00:25:17.547 { 00:25:17.547 "name": "spare", 00:25:17.547 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:17.547 "is_configured": true, 00:25:17.547 "data_offset": 2048, 00:25:17.547 "data_size": 63488 00:25:17.547 }, 00:25:17.547 { 
00:25:17.547 "name": "BaseBdev2", 00:25:17.547 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:17.547 "is_configured": true, 00:25:17.547 "data_offset": 2048, 00:25:17.547 "data_size": 63488 00:25:17.548 } 00:25:17.548 ] 00:25:17.548 }' 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.548 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:17.806 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.806 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:18.065 [2024-07-15 12:06:31.575355] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.065 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.377 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:18.377 "name": "raid_bdev1", 00:25:18.377 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:18.377 "strip_size_kb": 0, 00:25:18.377 "state": "online", 00:25:18.377 "raid_level": "raid1", 00:25:18.377 "superblock": true, 00:25:18.377 "num_base_bdevs": 2, 00:25:18.377 "num_base_bdevs_discovered": 1, 00:25:18.377 "num_base_bdevs_operational": 1, 00:25:18.377 "base_bdevs_list": [ 00:25:18.377 { 00:25:18.377 "name": null, 00:25:18.377 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.377 "is_configured": false, 00:25:18.377 "data_offset": 2048, 00:25:18.377 "data_size": 63488 00:25:18.377 }, 00:25:18.377 { 00:25:18.377 "name": "BaseBdev2", 00:25:18.377 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:18.377 "is_configured": true, 00:25:18.377 "data_offset": 2048, 00:25:18.377 "data_size": 63488 00:25:18.377 } 00:25:18.377 ] 00:25:18.377 }' 00:25:18.377 12:06:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:18.377 12:06:31 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:25:18.968 12:06:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:19.227 [2024-07-15 12:06:32.690323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:19.227 [2024-07-15 12:06:32.690500] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:19.227 [2024-07-15 12:06:32.690518] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:19.227 [2024-07-15 12:06:32.690549] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:19.227 [2024-07-15 12:06:32.696090] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfbfd0 00:25:19.227 [2024-07-15 12:06:32.697481] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:19.227 12:06:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.165 12:06:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.423 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.423 "name": "raid_bdev1", 00:25:20.423 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:20.423 "strip_size_kb": 0, 00:25:20.423 "state": "online", 00:25:20.423 "raid_level": "raid1", 00:25:20.423 "superblock": true, 00:25:20.423 "num_base_bdevs": 2, 00:25:20.423 "num_base_bdevs_discovered": 2, 00:25:20.423 "num_base_bdevs_operational": 2, 00:25:20.423 "process": { 00:25:20.423 "type": "rebuild", 00:25:20.423 "target": "spare", 00:25:20.423 "progress": { 00:25:20.423 "blocks": 24576, 00:25:20.423 "percent": 38 00:25:20.423 } 00:25:20.423 }, 00:25:20.423 "base_bdevs_list": [ 00:25:20.423 { 00:25:20.423 "name": "spare", 00:25:20.423 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:20.423 "is_configured": true, 00:25:20.423 "data_offset": 2048, 00:25:20.423 "data_size": 63488 00:25:20.423 }, 00:25:20.423 { 00:25:20.423 "name": "BaseBdev2", 00:25:20.423 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:20.423 "is_configured": true, 00:25:20.423 "data_offset": 2048, 00:25:20.423 "data_size": 63488 00:25:20.423 } 00:25:20.423 ] 00:25:20.423 }' 00:25:20.423 12:06:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:20.423 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:20.423 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:20.682 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:20.682 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:20.941 [2024-07-15 12:06:34.282384] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:25:20.941 [2024-07-15 12:06:34.310076] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:20.941 [2024-07-15 12:06:34.310118] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:20.941 [2024-07-15 12:06:34.310133] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:20.941 [2024-07-15 12:06:34.310141] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.941 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:21.201 12:06:34 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:21.201 "name": "raid_bdev1", 00:25:21.201 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:21.201 "strip_size_kb": 0, 00:25:21.201 "state": "online", 00:25:21.201 "raid_level": "raid1", 00:25:21.201 "superblock": true, 00:25:21.201 "num_base_bdevs": 2, 00:25:21.201 "num_base_bdevs_discovered": 1, 00:25:21.201 "num_base_bdevs_operational": 1, 00:25:21.201 "base_bdevs_list": [ 00:25:21.201 { 00:25:21.201 "name": null, 00:25:21.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:21.201 "is_configured": false, 00:25:21.201 "data_offset": 2048, 00:25:21.201 "data_size": 63488 00:25:21.201 }, 00:25:21.201 { 00:25:21.201 "name": "BaseBdev2", 00:25:21.201 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:21.201 "is_configured": true, 00:25:21.201 "data_offset": 2048, 00:25:21.201 "data_size": 63488 00:25:21.201 } 00:25:21.201 ] 00:25:21.201 }' 00:25:21.201 12:06:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:21.201 12:06:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:21.768 12:06:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:22.027 [2024-07-15 12:06:35.405228] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:22.027 [2024-07-15 12:06:35.405282] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:22.027 [2024-07-15 12:06:35.405304] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b54300 00:25:22.027 [2024-07-15 12:06:35.405317] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:22.027 [2024-07-15 12:06:35.405706] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:22.027 [2024-07-15 
12:06:35.405725] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:22.027 [2024-07-15 12:06:35.405807] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:22.027 [2024-07-15 12:06:35.405819] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:22.027 [2024-07-15 12:06:35.405830] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:22.027 [2024-07-15 12:06:35.405849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:22.027 [2024-07-15 12:06:35.410733] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cfe7b0 00:25:22.027 spare 00:25:22.027 [2024-07-15 12:06:35.412076] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:22.027 12:06:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:22.963 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:23.222 "name": "raid_bdev1", 00:25:23.222 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:23.222 "strip_size_kb": 0, 00:25:23.222 "state": "online", 00:25:23.222 "raid_level": "raid1", 00:25:23.222 "superblock": true, 00:25:23.222 "num_base_bdevs": 2, 00:25:23.222 "num_base_bdevs_discovered": 2, 00:25:23.222 "num_base_bdevs_operational": 2, 00:25:23.222 "process": { 00:25:23.222 "type": "rebuild", 00:25:23.222 "target": "spare", 00:25:23.222 "progress": { 00:25:23.222 "blocks": 22528, 00:25:23.222 "percent": 35 00:25:23.222 } 00:25:23.222 }, 00:25:23.222 "base_bdevs_list": [ 00:25:23.222 { 00:25:23.222 "name": "spare", 00:25:23.222 "uuid": "a4c6e218-36a6-576f-bb71-d6afa0fa66cd", 00:25:23.222 "is_configured": true, 00:25:23.222 "data_offset": 2048, 00:25:23.222 "data_size": 63488 00:25:23.222 }, 00:25:23.222 { 00:25:23.222 "name": "BaseBdev2", 00:25:23.222 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:23.222 "is_configured": true, 00:25:23.222 "data_offset": 2048, 00:25:23.222 "data_size": 63488 00:25:23.222 } 00:25:23.222 ] 00:25:23.222 }' 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:23.222 12:06:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:23.481 [2024-07-15 12:06:36.982897] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:23.481 [2024-07-15 12:06:37.024443] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:25:23.481 [2024-07-15 12:06:37.024485] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.481 [2024-07-15 12:06:37.024499] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:23.481 [2024-07-15 12:06:37.024508] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.481 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.740 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.740 "name": "raid_bdev1", 00:25:23.740 "uuid": 
"4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:23.740 "strip_size_kb": 0, 00:25:23.740 "state": "online", 00:25:23.740 "raid_level": "raid1", 00:25:23.740 "superblock": true, 00:25:23.740 "num_base_bdevs": 2, 00:25:23.740 "num_base_bdevs_discovered": 1, 00:25:23.740 "num_base_bdevs_operational": 1, 00:25:23.740 "base_bdevs_list": [ 00:25:23.740 { 00:25:23.740 "name": null, 00:25:23.740 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.740 "is_configured": false, 00:25:23.740 "data_offset": 2048, 00:25:23.740 "data_size": 63488 00:25:23.740 }, 00:25:23.740 { 00:25:23.740 "name": "BaseBdev2", 00:25:23.740 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:23.740 "is_configured": true, 00:25:23.740 "data_offset": 2048, 00:25:23.740 "data_size": 63488 00:25:23.740 } 00:25:23.740 ] 00:25:23.740 }' 00:25:23.740 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.740 12:06:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.677 12:06:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:25:24.677 "name": "raid_bdev1", 00:25:24.677 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:24.677 "strip_size_kb": 0, 00:25:24.677 "state": "online", 00:25:24.677 "raid_level": "raid1", 00:25:24.677 "superblock": true, 00:25:24.677 "num_base_bdevs": 2, 00:25:24.677 "num_base_bdevs_discovered": 1, 00:25:24.677 "num_base_bdevs_operational": 1, 00:25:24.677 "base_bdevs_list": [ 00:25:24.677 { 00:25:24.677 "name": null, 00:25:24.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.677 "is_configured": false, 00:25:24.677 "data_offset": 2048, 00:25:24.677 "data_size": 63488 00:25:24.677 }, 00:25:24.677 { 00:25:24.677 "name": "BaseBdev2", 00:25:24.677 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:24.677 "is_configured": true, 00:25:24.677 "data_offset": 2048, 00:25:24.677 "data_size": 63488 00:25:24.677 } 00:25:24.677 ] 00:25:24.677 }' 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:24.677 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:24.936 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:25.195 [2024-07-15 12:06:38.677940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:25.195 [2024-07-15 12:06:38.677994] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:25.195 
[2024-07-15 12:06:38.678014] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b56bf0 00:25:25.195 [2024-07-15 12:06:38.678027] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:25.195 [2024-07-15 12:06:38.678383] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:25.195 [2024-07-15 12:06:38.678400] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:25.195 [2024-07-15 12:06:38.678464] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:25.195 [2024-07-15 12:06:38.678475] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:25.195 [2024-07-15 12:06:38.678485] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:25.195 BaseBdev1 00:25:25.195 12:06:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.133 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.393 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.393 "name": "raid_bdev1", 00:25:26.393 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:26.393 "strip_size_kb": 0, 00:25:26.393 "state": "online", 00:25:26.393 "raid_level": "raid1", 00:25:26.393 "superblock": true, 00:25:26.393 "num_base_bdevs": 2, 00:25:26.393 "num_base_bdevs_discovered": 1, 00:25:26.393 "num_base_bdevs_operational": 1, 00:25:26.393 "base_bdevs_list": [ 00:25:26.393 { 00:25:26.393 "name": null, 00:25:26.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:26.394 "is_configured": false, 00:25:26.394 "data_offset": 2048, 00:25:26.394 "data_size": 63488 00:25:26.394 }, 00:25:26.394 { 00:25:26.394 "name": "BaseBdev2", 00:25:26.394 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:26.394 "is_configured": true, 00:25:26.394 "data_offset": 2048, 00:25:26.394 "data_size": 63488 00:25:26.394 } 00:25:26.394 ] 00:25:26.394 }' 00:25:26.394 12:06:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.394 12:06:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.332 "name": "raid_bdev1", 00:25:27.332 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:27.332 "strip_size_kb": 0, 00:25:27.332 "state": "online", 00:25:27.332 "raid_level": "raid1", 00:25:27.332 "superblock": true, 00:25:27.332 "num_base_bdevs": 2, 00:25:27.332 "num_base_bdevs_discovered": 1, 00:25:27.332 "num_base_bdevs_operational": 1, 00:25:27.332 "base_bdevs_list": [ 00:25:27.332 { 00:25:27.332 "name": null, 00:25:27.332 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.332 "is_configured": false, 00:25:27.332 "data_offset": 2048, 00:25:27.332 "data_size": 63488 00:25:27.332 }, 00:25:27.332 { 00:25:27.332 "name": "BaseBdev2", 00:25:27.332 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:27.332 "is_configured": true, 00:25:27.332 "data_offset": 2048, 00:25:27.332 "data_size": 63488 00:25:27.332 } 00:25:27.332 ] 00:25:27.332 }' 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:27.332 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:27.592 12:06:40 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:27.592 [2024-07-15 12:06:41.180706] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:25:27.592 [2024-07-15 12:06:41.180838] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:25:27.592 [2024-07-15 12:06:41.180853] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:27.592 request: 00:25:27.592 { 00:25:27.592 "base_bdev": "BaseBdev1", 00:25:27.592 "raid_bdev": "raid_bdev1", 00:25:27.592 "method": "bdev_raid_add_base_bdev", 00:25:27.592 "req_id": 1 00:25:27.592 } 00:25:27.592 Got JSON-RPC error response 00:25:27.592 response: 00:25:27.592 { 00:25:27.592 "code": -22, 00:25:27.592 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:27.592 } 00:25:27.851 12:06:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:25:27.851 12:06:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:25:27.851 12:06:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:25:27.851 12:06:41 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:25:27.851 12:06:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:28.789 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.047 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:29.047 "name": "raid_bdev1", 00:25:29.047 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:29.047 "strip_size_kb": 0, 00:25:29.047 "state": "online", 00:25:29.047 "raid_level": "raid1", 00:25:29.047 "superblock": true, 00:25:29.047 "num_base_bdevs": 2, 00:25:29.047 "num_base_bdevs_discovered": 1, 00:25:29.047 "num_base_bdevs_operational": 1, 00:25:29.047 "base_bdevs_list": [ 00:25:29.047 { 00:25:29.047 "name": null, 00:25:29.047 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.047 "is_configured": false, 00:25:29.047 "data_offset": 2048, 00:25:29.047 "data_size": 63488 00:25:29.047 }, 00:25:29.047 { 00:25:29.047 "name": "BaseBdev2", 00:25:29.047 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:29.047 "is_configured": true, 00:25:29.047 "data_offset": 2048, 00:25:29.047 "data_size": 63488 00:25:29.047 } 00:25:29.047 ] 00:25:29.047 }' 00:25:29.047 12:06:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:29.047 12:06:42 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:29.616 
12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.616 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.875 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.875 "name": "raid_bdev1", 00:25:29.875 "uuid": "4e9361e7-93b7-4826-b142-d630ce65952d", 00:25:29.875 "strip_size_kb": 0, 00:25:29.875 "state": "online", 00:25:29.875 "raid_level": "raid1", 00:25:29.875 "superblock": true, 00:25:29.876 "num_base_bdevs": 2, 00:25:29.876 "num_base_bdevs_discovered": 1, 00:25:29.876 "num_base_bdevs_operational": 1, 00:25:29.876 "base_bdevs_list": [ 00:25:29.876 { 00:25:29.876 "name": null, 00:25:29.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.876 "is_configured": false, 00:25:29.876 "data_offset": 2048, 00:25:29.876 "data_size": 63488 00:25:29.876 }, 00:25:29.876 { 00:25:29.876 "name": "BaseBdev2", 00:25:29.876 "uuid": "a8024f49-2228-548f-9781-9ed031c939a7", 00:25:29.876 "is_configured": true, 00:25:29.876 "data_offset": 2048, 00:25:29.876 "data_size": 63488 00:25:29.876 } 00:25:29.876 ] 00:25:29.876 }' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1567916 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1567916 ']' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1567916 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1567916 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1567916' 00:25:29.876 killing process with pid 1567916 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1567916 00:25:29.876 Received shutdown signal, test time was about 60.000000 seconds 00:25:29.876 00:25:29.876 Latency(us) 00:25:29.876 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:29.876 =================================================================================================================== 00:25:29.876 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:25:29.876 [2024-07-15 12:06:43.460137] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:29.876 [2024-07-15 12:06:43.460230] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:29.876 [2024-07-15 12:06:43.460282] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:29.876 [2024-07-15 12:06:43.460295] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b4fa40 name raid_bdev1, state offline 00:25:29.876 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1567916 00:25:30.135 [2024-07-15 12:06:43.488413] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:30.135 12:06:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:25:30.135 00:25:30.135 real 0m37.242s 00:25:30.135 user 0m53.579s 00:25:30.135 sys 0m7.358s 00:25:30.135 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:30.135 12:06:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:25:30.135 ************************************ 00:25:30.135 END TEST raid_rebuild_test_sb 00:25:30.135 ************************************ 00:25:30.395 12:06:43 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:30.395 12:06:43 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:25:30.395 12:06:43 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:30.395 12:06:43 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:30.395 12:06:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:30.395 ************************************ 00:25:30.395 START TEST raid_rebuild_test_io 00:25:30.395 ************************************ 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 false true true 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:30.395 12:06:43 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1573105 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1573105 /var/tmp/spdk-raid.sock 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 1573105 ']' 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:30.395 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:30.395 12:06:43 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:30.395 [2024-07-15 12:06:43.873776] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:25:30.395 [2024-07-15 12:06:43.873844] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1573105 ] 00:25:30.395 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:25:30.395 Zero copy mechanism will not be used. 00:25:30.655 [2024-07-15 12:06:43.992801] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:30.655 [2024-07-15 12:06:44.099369] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:30.655 [2024-07-15 12:06:44.161044] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:30.655 [2024-07-15 12:06:44.161077] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:31.222 12:06:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:31.222 12:06:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:25:31.222 12:06:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:31.222 12:06:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:31.481 BaseBdev1_malloc 00:25:31.481 12:06:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:31.740 [2024-07-15 12:06:45.269974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:31.740 [2024-07-15 12:06:45.270020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.740 [2024-07-15 12:06:45.270043] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0b9c0 00:25:31.740 [2024-07-15 12:06:45.270056] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.740 [2024-07-15 12:06:45.271656] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.740 [2024-07-15 12:06:45.271683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev1 00:25:31.740 BaseBdev1 00:25:31.740 12:06:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:31.740 12:06:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:31.999 BaseBdev2_malloc 00:25:31.999 12:06:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:32.258 [2024-07-15 12:06:45.775977] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:32.258 [2024-07-15 12:06:45.776025] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:32.258 [2024-07-15 12:06:45.776049] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd0c510 00:25:32.258 [2024-07-15 12:06:45.776061] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.258 [2024-07-15 12:06:45.777618] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.258 [2024-07-15 12:06:45.777649] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:32.258 BaseBdev2 00:25:32.258 12:06:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:32.518 spare_malloc 00:25:32.518 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:32.777 spare_delay 00:25:32.777 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:33.036 [2024-07-15 12:06:46.511700] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:33.036 [2024-07-15 12:06:46.511749] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:33.036 [2024-07-15 12:06:46.511773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeb3520 00:25:33.036 [2024-07-15 12:06:46.511788] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:33.037 [2024-07-15 12:06:46.513403] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:33.037 [2024-07-15 12:06:46.513434] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:33.037 spare 00:25:33.037 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:33.312 [2024-07-15 12:06:46.744315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:33.312 [2024-07-15 12:06:46.745609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:33.312 [2024-07-15 12:06:46.745693] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xeb4ab0 00:25:33.312 [2024-07-15 12:06:46.745705] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:25:33.312 [2024-07-15 12:06:46.745911] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd0b5c0 00:25:33.312 [2024-07-15 12:06:46.746059] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xeb4ab0 00:25:33.312 [2024-07-15 12:06:46.746069] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xeb4ab0 00:25:33.312 [2024-07-15 12:06:46.746183] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.312 12:06:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.571 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:33.571 "name": "raid_bdev1", 00:25:33.571 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:33.571 "strip_size_kb": 0, 00:25:33.571 "state": "online", 00:25:33.571 "raid_level": "raid1", 00:25:33.571 "superblock": false, 00:25:33.571 "num_base_bdevs": 2, 00:25:33.571 "num_base_bdevs_discovered": 2, 00:25:33.571 "num_base_bdevs_operational": 
2, 00:25:33.571 "base_bdevs_list": [ 00:25:33.571 { 00:25:33.571 "name": "BaseBdev1", 00:25:33.571 "uuid": "b9f0c475-8dad-558d-a1c3-643eede4804b", 00:25:33.571 "is_configured": true, 00:25:33.571 "data_offset": 0, 00:25:33.571 "data_size": 65536 00:25:33.571 }, 00:25:33.571 { 00:25:33.571 "name": "BaseBdev2", 00:25:33.571 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:33.571 "is_configured": true, 00:25:33.571 "data_offset": 0, 00:25:33.571 "data_size": 65536 00:25:33.571 } 00:25:33.571 ] 00:25:33.571 }' 00:25:33.571 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:33.571 12:06:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:34.140 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:34.140 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:34.399 [2024-07-15 12:06:47.827417] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:34.399 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:25:34.399 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.399 12:06:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:34.658 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:25:34.658 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:34.658 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:34.658 12:06:48 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:34.658 [2024-07-15 12:06:48.198496] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd030d0 00:25:34.658 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:34.658 Zero copy mechanism will not be used. 00:25:34.658 Running I/O for 60 seconds... 00:25:34.918 [2024-07-15 12:06:48.335332] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:34.918 [2024-07-15 12:06:48.335514] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xd030d0 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:34.918 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.177 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:35.177 "name": "raid_bdev1", 00:25:35.177 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:35.177 "strip_size_kb": 0, 00:25:35.177 "state": "online", 00:25:35.177 "raid_level": "raid1", 00:25:35.177 "superblock": false, 00:25:35.177 "num_base_bdevs": 2, 00:25:35.177 "num_base_bdevs_discovered": 1, 00:25:35.177 "num_base_bdevs_operational": 1, 00:25:35.177 "base_bdevs_list": [ 00:25:35.177 { 00:25:35.177 "name": null, 00:25:35.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.177 "is_configured": false, 00:25:35.177 "data_offset": 0, 00:25:35.177 "data_size": 65536 00:25:35.177 }, 00:25:35.177 { 00:25:35.177 "name": "BaseBdev2", 00:25:35.177 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:35.177 "is_configured": true, 00:25:35.177 "data_offset": 0, 00:25:35.177 "data_size": 65536 00:25:35.177 } 00:25:35.177 ] 00:25:35.177 }' 00:25:35.177 12:06:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:35.178 12:06:48 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.746 12:06:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:36.005 [2024-07-15 12:06:49.492473] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:36.005 12:06:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:36.005 [2024-07-15 12:06:49.535599] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa13230 00:25:36.005 [2024-07-15 12:06:49.538137] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:25:36.264 [2024-07-15 12:06:49.655901] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:36.264 [2024-07-15 12:06:49.656287] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:36.523 [2024-07-15 12:06:49.866937] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:36.523 [2024-07-15 12:06:49.867111] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:36.782 [2024-07-15 12:06:50.123100] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:36.782 [2024-07-15 12:06:50.342518] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:36.782 [2024-07-15 12:06:50.342767] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:37.041 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:37.041 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:37.042 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:37.042 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:37.042 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:37.042 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.042 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:25:37.301 [2024-07-15 12:06:50.734319] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:37.301 "name": "raid_bdev1", 00:25:37.301 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:37.301 "strip_size_kb": 0, 00:25:37.301 "state": "online", 00:25:37.301 "raid_level": "raid1", 00:25:37.301 "superblock": false, 00:25:37.301 "num_base_bdevs": 2, 00:25:37.301 "num_base_bdevs_discovered": 2, 00:25:37.301 "num_base_bdevs_operational": 2, 00:25:37.301 "process": { 00:25:37.301 "type": "rebuild", 00:25:37.301 "target": "spare", 00:25:37.301 "progress": { 00:25:37.301 "blocks": 14336, 00:25:37.301 "percent": 21 00:25:37.301 } 00:25:37.301 }, 00:25:37.301 "base_bdevs_list": [ 00:25:37.301 { 00:25:37.301 "name": "spare", 00:25:37.301 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:37.301 "is_configured": true, 00:25:37.301 "data_offset": 0, 00:25:37.301 "data_size": 65536 00:25:37.301 }, 00:25:37.301 { 00:25:37.301 "name": "BaseBdev2", 00:25:37.301 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:37.301 "is_configured": true, 00:25:37.301 "data_offset": 0, 00:25:37.301 "data_size": 65536 00:25:37.301 } 00:25:37.301 ] 00:25:37.301 }' 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:37.301 12:06:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:25:37.561 [2024-07-15 12:06:50.954950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:37.561 [2024-07-15 12:06:50.955220] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:37.561 [2024-07-15 12:06:51.101461] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:37.821 [2024-07-15 12:06:51.319590] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:37.821 [2024-07-15 12:06:51.329282] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:37.821 [2024-07-15 12:06:51.329308] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:37.821 [2024-07-15 12:06:51.329319] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:37.821 [2024-07-15 12:06:51.351548] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xd030d0 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:37.821 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:38.080 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.080 "name": "raid_bdev1", 00:25:38.080 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:38.080 "strip_size_kb": 0, 00:25:38.080 "state": "online", 00:25:38.080 "raid_level": "raid1", 00:25:38.080 "superblock": false, 00:25:38.080 "num_base_bdevs": 2, 00:25:38.080 "num_base_bdevs_discovered": 1, 00:25:38.080 "num_base_bdevs_operational": 1, 00:25:38.080 "base_bdevs_list": [ 00:25:38.080 { 00:25:38.080 "name": null, 00:25:38.080 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.080 "is_configured": false, 00:25:38.080 "data_offset": 0, 00:25:38.080 "data_size": 65536 00:25:38.080 }, 00:25:38.080 { 00:25:38.080 "name": "BaseBdev2", 00:25:38.080 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:38.080 "is_configured": true, 00:25:38.080 "data_offset": 0, 00:25:38.080 "data_size": 65536 00:25:38.080 } 00:25:38.080 ] 00:25:38.080 }' 00:25:38.080 12:06:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.080 12:06:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:39.018 12:06:52 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:39.018 "name": "raid_bdev1", 00:25:39.018 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:39.018 "strip_size_kb": 0, 00:25:39.018 "state": "online", 00:25:39.018 "raid_level": "raid1", 00:25:39.018 "superblock": false, 00:25:39.018 "num_base_bdevs": 2, 00:25:39.018 "num_base_bdevs_discovered": 1, 00:25:39.018 "num_base_bdevs_operational": 1, 00:25:39.018 "base_bdevs_list": [ 00:25:39.018 { 00:25:39.018 "name": null, 00:25:39.018 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.018 "is_configured": false, 00:25:39.018 "data_offset": 0, 00:25:39.018 "data_size": 65536 00:25:39.018 }, 00:25:39.018 { 00:25:39.018 "name": "BaseBdev2", 00:25:39.018 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:39.018 "is_configured": true, 00:25:39.018 "data_offset": 0, 00:25:39.018 "data_size": 65536 00:25:39.018 } 00:25:39.018 ] 00:25:39.018 }' 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:25:39.018 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:39.278 [2024-07-15 12:06:52.828466] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:39.535 12:06:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:39.535 [2024-07-15 12:06:52.904751] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd32460 00:25:39.535 [2024-07-15 12:06:52.906250] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:39.535 [2024-07-15 12:06:53.016238] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:39.535 [2024-07-15 12:06:53.016733] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:39.793 [2024-07-15 12:06:53.145002] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:39.793 [2024-07-15 12:06:53.145178] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:40.053 [2024-07-15 12:06:53.466912] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:40.053 [2024-07-15 12:06:53.467229] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:40.053 [2024-07-15 12:06:53.595529] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:40.311 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.311 12:06:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.311 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.311 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.311 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.614 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.614 12:06:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.614 [2024-07-15 12:06:53.934815] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:40.614 [2024-07-15 12:06:54.044178] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:40.614 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:40.614 "name": "raid_bdev1", 00:25:40.614 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:40.614 "strip_size_kb": 0, 00:25:40.614 "state": "online", 00:25:40.614 "raid_level": "raid1", 00:25:40.614 "superblock": false, 00:25:40.614 "num_base_bdevs": 2, 00:25:40.614 "num_base_bdevs_discovered": 2, 00:25:40.614 "num_base_bdevs_operational": 2, 00:25:40.614 "process": { 00:25:40.614 "type": "rebuild", 00:25:40.614 "target": "spare", 00:25:40.614 "progress": { 00:25:40.614 "blocks": 16384, 00:25:40.614 "percent": 25 00:25:40.614 } 00:25:40.614 }, 00:25:40.614 "base_bdevs_list": [ 00:25:40.614 { 00:25:40.614 "name": "spare", 00:25:40.614 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:40.614 "is_configured": true, 00:25:40.614 "data_offset": 0, 00:25:40.614 "data_size": 65536 00:25:40.614 }, 00:25:40.614 { 
00:25:40.614 "name": "BaseBdev2", 00:25:40.614 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:40.614 "is_configured": true, 00:25:40.614 "data_offset": 0, 00:25:40.615 "data_size": 65536 00:25:40.615 } 00:25:40.615 ] 00:25:40.615 }' 00:25:40.615 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:40.615 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:40.615 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=848 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.893 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:40.893 [2024-07-15 12:06:54.383570] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:41.152 "name": "raid_bdev1", 00:25:41.152 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:41.152 "strip_size_kb": 0, 00:25:41.152 "state": "online", 00:25:41.152 "raid_level": "raid1", 00:25:41.152 "superblock": false, 00:25:41.152 "num_base_bdevs": 2, 00:25:41.152 "num_base_bdevs_discovered": 2, 00:25:41.152 "num_base_bdevs_operational": 2, 00:25:41.152 "process": { 00:25:41.152 "type": "rebuild", 00:25:41.152 "target": "spare", 00:25:41.152 "progress": { 00:25:41.152 "blocks": 20480, 00:25:41.152 "percent": 31 00:25:41.152 } 00:25:41.152 }, 00:25:41.152 "base_bdevs_list": [ 00:25:41.152 { 00:25:41.152 "name": "spare", 00:25:41.152 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:41.152 "is_configured": true, 00:25:41.152 "data_offset": 0, 00:25:41.152 "data_size": 65536 00:25:41.152 }, 00:25:41.152 { 00:25:41.152 "name": "BaseBdev2", 00:25:41.152 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:41.152 "is_configured": true, 00:25:41.152 "data_offset": 0, 00:25:41.152 "data_size": 65536 00:25:41.152 } 00:25:41.152 ] 00:25:41.152 }' 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:41.152 12:06:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:42.086 [2024-07-15 12:06:55.510426] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:42.086 12:06:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.650 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:42.650 "name": "raid_bdev1", 00:25:42.650 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:42.650 "strip_size_kb": 0, 00:25:42.650 "state": "online", 00:25:42.650 "raid_level": "raid1", 00:25:42.650 "superblock": false, 00:25:42.650 "num_base_bdevs": 2, 00:25:42.650 "num_base_bdevs_discovered": 2, 00:25:42.650 "num_base_bdevs_operational": 2, 00:25:42.650 "process": { 00:25:42.650 "type": "rebuild", 00:25:42.650 "target": "spare", 00:25:42.650 "progress": { 00:25:42.650 "blocks": 49152, 00:25:42.650 "percent": 75 00:25:42.650 } 00:25:42.650 }, 00:25:42.650 "base_bdevs_list": [ 
00:25:42.650 { 00:25:42.650 "name": "spare", 00:25:42.650 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:42.650 "is_configured": true, 00:25:42.650 "data_offset": 0, 00:25:42.650 "data_size": 65536 00:25:42.651 }, 00:25:42.651 { 00:25:42.651 "name": "BaseBdev2", 00:25:42.651 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:42.651 "is_configured": true, 00:25:42.651 "data_offset": 0, 00:25:42.651 "data_size": 65536 00:25:42.651 } 00:25:42.651 ] 00:25:42.651 }' 00:25:42.651 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:42.651 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:42.651 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:42.909 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:42.909 12:06:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:43.475 [2024-07-15 12:06:57.030393] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:43.733 [2024-07-15 12:06:57.138635] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:43.733 [2024-07-15 12:06:57.140364] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:43.733 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:43.733 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:43.734 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:43.734 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:43.734 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:43.734 12:06:57 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:43.734 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.734 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:43.992 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:43.992 "name": "raid_bdev1", 00:25:43.992 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:43.992 "strip_size_kb": 0, 00:25:43.992 "state": "online", 00:25:43.992 "raid_level": "raid1", 00:25:43.992 "superblock": false, 00:25:43.992 "num_base_bdevs": 2, 00:25:43.992 "num_base_bdevs_discovered": 2, 00:25:43.992 "num_base_bdevs_operational": 2, 00:25:43.992 "base_bdevs_list": [ 00:25:43.992 { 00:25:43.992 "name": "spare", 00:25:43.992 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:43.992 "is_configured": true, 00:25:43.992 "data_offset": 0, 00:25:43.992 "data_size": 65536 00:25:43.992 }, 00:25:43.992 { 00:25:43.992 "name": "BaseBdev2", 00:25:43.992 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:43.992 "is_configured": true, 00:25:43.992 "data_offset": 0, 00:25:43.992 "data_size": 65536 00:25:43.992 } 00:25:43.992 ] 00:25:43.992 }' 00:25:43.992 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:43.992 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:43.992 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.249 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.508 "name": "raid_bdev1", 00:25:44.508 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:44.508 "strip_size_kb": 0, 00:25:44.508 "state": "online", 00:25:44.508 "raid_level": "raid1", 00:25:44.508 "superblock": false, 00:25:44.508 "num_base_bdevs": 2, 00:25:44.508 "num_base_bdevs_discovered": 2, 00:25:44.508 "num_base_bdevs_operational": 2, 00:25:44.508 "base_bdevs_list": [ 00:25:44.508 { 00:25:44.508 "name": "spare", 00:25:44.508 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:44.508 "is_configured": true, 00:25:44.508 "data_offset": 0, 00:25:44.508 "data_size": 65536 00:25:44.508 }, 00:25:44.508 { 00:25:44.508 "name": "BaseBdev2", 00:25:44.508 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:44.508 "is_configured": true, 00:25:44.508 "data_offset": 0, 00:25:44.508 "data_size": 65536 00:25:44.508 } 00:25:44.508 ] 00:25:44.508 }' 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:44.508 
12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.508 12:06:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.767 12:06:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.767 "name": "raid_bdev1", 00:25:44.767 "uuid": "4ae83487-80f4-4d6a-aa18-a98b5d38ba6e", 00:25:44.767 "strip_size_kb": 0, 00:25:44.767 "state": "online", 00:25:44.767 "raid_level": "raid1", 00:25:44.767 "superblock": false, 00:25:44.767 "num_base_bdevs": 
2, 00:25:44.767 "num_base_bdevs_discovered": 2, 00:25:44.767 "num_base_bdevs_operational": 2, 00:25:44.767 "base_bdevs_list": [ 00:25:44.767 { 00:25:44.767 "name": "spare", 00:25:44.767 "uuid": "64979012-6319-5dd3-8d75-5616f7125374", 00:25:44.767 "is_configured": true, 00:25:44.767 "data_offset": 0, 00:25:44.767 "data_size": 65536 00:25:44.767 }, 00:25:44.767 { 00:25:44.767 "name": "BaseBdev2", 00:25:44.767 "uuid": "69f4ab36-8fa6-5705-a416-2028832e763a", 00:25:44.767 "is_configured": true, 00:25:44.767 "data_offset": 0, 00:25:44.767 "data_size": 65536 00:25:44.767 } 00:25:44.767 ] 00:25:44.767 }' 00:25:44.767 12:06:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.767 12:06:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:45.335 12:06:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:45.595 [2024-07-15 12:06:59.074527] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:45.595 [2024-07-15 12:06:59.074559] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:45.595 00:25:45.595 Latency(us) 00:25:45.595 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:45.595 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:45.595 raid_bdev1 : 10.90 97.01 291.04 0.00 0.00 13836.31 293.84 118534.68 00:25:45.595 =================================================================================================================== 00:25:45.595 Total : 97.01 291.04 0.00 0.00 13836.31 293.84 118534.68 00:25:45.595 [2024-07-15 12:06:59.126616] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:45.595 [2024-07-15 12:06:59.126645] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:45.595 
[2024-07-15 12:06:59.126722] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:45.595 [2024-07-15 12:06:59.126735] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xeb4ab0 name raid_bdev1, state offline 00:25:45.595 0 00:25:45.595 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.596 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:45.854 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:45.855 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:46.113 /dev/nbd0 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.113 1+0 records in 00:25:46.113 1+0 records out 00:25:46.113 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000258253 s, 15.9 MB/s 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 
0 ']' 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:25:46.113 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.114 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:25:46.372 /dev/nbd1 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd1 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:46.372 1+0 records in 00:25:46.372 1+0 records out 00:25:46.372 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257411 s, 15.9 MB/s 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:25:46.372 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:46.630 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:25:46.630 12:06:59 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:25:46.630 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:46.630 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:46.630 12:06:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- 
# cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.630 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:46.888 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:46.888 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:46.888 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:46.888 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:46.889 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1573105 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1573105 ']' 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1573105 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:25:47.148 12:07:00 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1573105 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1573105' 00:25:47.148 killing process with pid 1573105 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1573105 00:25:47.148 Received shutdown signal, test time was about 12.424984 seconds 00:25:47.148 00:25:47.148 Latency(us) 00:25:47.148 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:47.148 =================================================================================================================== 00:25:47.148 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:47.148 [2024-07-15 12:07:00.656062] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:47.148 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1573105 00:25:47.148 [2024-07-15 12:07:00.678244] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:47.407 12:07:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:47.407 00:25:47.407 real 0m17.100s 00:25:47.407 user 0m26.564s 00:25:47.407 sys 0m2.838s 00:25:47.407 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:25:47.407 12:07:00 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:47.407 ************************************ 00:25:47.407 END TEST raid_rebuild_test_io 00:25:47.407 ************************************ 00:25:47.407 12:07:00 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:25:47.407 12:07:00 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test 
raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:25:47.407 12:07:00 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:25:47.407 12:07:00 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:25:47.407 12:07:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:47.407 ************************************ 00:25:47.407 START TEST raid_rebuild_test_sb_io 00:25:47.407 ************************************ 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true true true 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:47.408 12:07:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:47.408 12:07:01 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:47.408 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:47.408 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:47.408 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:47.408 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:47.408 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1575446 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1575446 /var/tmp/spdk-raid.sock 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1575446 ']' 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local 
max_retries=100 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:47.667 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:25:47.667 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:47.667 [2024-07-15 12:07:01.066314] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:25:47.667 [2024-07-15 12:07:01.066379] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1575446 ] 00:25:47.667 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:47.667 Zero copy mechanism will not be used. 
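The `waitfornbd` helper traced earlier in this pass (`autotest_common.sh@866`–`@887`) first polls `/proc/partitions` for the device name, then proves the NBD connection is actually live with a single 4 KiB direct-I/O read. A minimal reconstruction of that pattern — the 20-iteration bound and the `dd`/`stat` size check come from the trace, but the 0.1 s sleep interval and the `/tmp` scratch path are assumptions, not spelled out in the log:

```shell
# Sketch of the waitfornbd readiness check traced above. Assumptions are
# marked; the rest mirrors the autotest_common.sh xtrace in this log.
waitfornbd() {
    local nbd_name=$1
    local scratch=/tmp/nbdtest.$$   # assumed scratch path
    local i
    # Phase 1: wait for the device to register in /proc/partitions.
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" /proc/partitions; then
            break
        fi
        sleep 0.1                   # assumed backoff interval
    done
    # Phase 2: one 4 KiB O_DIRECT read proves the nbd export serves I/O.
    for ((i = 1; i <= 20; i++)); do
        if dd if="/dev/$nbd_name" of="$scratch" bs=4096 count=1 iflag=direct 2>/dev/null; then
            local size
            size=$(stat -c %s "$scratch")
            rm -f "$scratch"
            if [ "$size" != 0 ]; then
                return 0
            fi
        fi
        sleep 0.1
    done
    rm -f "$scratch"
    return 1
}
```

In the trace, each `nbd_start_disk` RPC is immediately followed by `waitfornbd nbd0` (or `nbd1`) so that no I/O is issued against an export that has not finished attaching.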
00:25:47.667 [2024-07-15 12:07:01.198666] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:47.927 [2024-07-15 12:07:01.299776] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:47.927 [2024-07-15 12:07:01.375367] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:47.927 [2024-07-15 12:07:01.375397] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:48.496 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:25:48.496 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:25:48.496 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:48.496 12:07:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:48.754 BaseBdev1_malloc 00:25:48.754 12:07:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:49.013 [2024-07-15 12:07:02.477901] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:49.013 [2024-07-15 12:07:02.477954] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.013 [2024-07-15 12:07:02.477975] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd49c0 00:25:49.013 [2024-07-15 12:07:02.477988] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.013 [2024-07-15 12:07:02.479547] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.013 [2024-07-15 12:07:02.479576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:49.013 
BaseBdev1 00:25:49.013 12:07:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:49.013 12:07:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:49.271 BaseBdev2_malloc 00:25:49.271 12:07:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:49.530 [2024-07-15 12:07:02.975859] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:49.530 [2024-07-15 12:07:02.975903] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:49.530 [2024-07-15 12:07:02.975925] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd5510 00:25:49.530 [2024-07-15 12:07:02.975938] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:49.530 [2024-07-15 12:07:02.977295] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:49.530 [2024-07-15 12:07:02.977323] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:49.530 BaseBdev2 00:25:49.530 12:07:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:49.796 spare_malloc 00:25:49.796 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:50.056 spare_delay 00:25:50.056 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:50.316 [2024-07-15 12:07:03.658133] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:50.316 [2024-07-15 12:07:03.658178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:50.316 [2024-07-15 12:07:03.658197] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe7c520 00:25:50.316 [2024-07-15 12:07:03.658210] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:50.316 [2024-07-15 12:07:03.659654] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:50.316 [2024-07-15 12:07:03.659691] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:50.316 spare 00:25:50.316 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:50.316 [2024-07-15 12:07:03.906816] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:50.316 [2024-07-15 12:07:03.908045] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:50.316 [2024-07-15 12:07:03.908200] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe7dab0 00:25:50.316 [2024-07-15 12:07:03.908213] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:50.316 [2024-07-15 12:07:03.908399] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd45c0 00:25:50.316 [2024-07-15 12:07:03.908539] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe7dab0 00:25:50.316 [2024-07-15 12:07:03.908549] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xe7dab0 00:25:50.316 [2024-07-15 12:07:03.908648] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.575 12:07:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.835 12:07:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:50.835 "name": "raid_bdev1", 00:25:50.835 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:50.835 "strip_size_kb": 0, 00:25:50.835 "state": "online", 00:25:50.835 "raid_level": "raid1", 00:25:50.835 "superblock": true, 00:25:50.835 "num_base_bdevs": 2, 00:25:50.835 
"num_base_bdevs_discovered": 2, 00:25:50.835 "num_base_bdevs_operational": 2, 00:25:50.835 "base_bdevs_list": [ 00:25:50.835 { 00:25:50.835 "name": "BaseBdev1", 00:25:50.835 "uuid": "1f01c031-fd53-5d5b-b807-5c0ef1c8e64f", 00:25:50.835 "is_configured": true, 00:25:50.835 "data_offset": 2048, 00:25:50.835 "data_size": 63488 00:25:50.835 }, 00:25:50.835 { 00:25:50.835 "name": "BaseBdev2", 00:25:50.835 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:50.835 "is_configured": true, 00:25:50.835 "data_offset": 2048, 00:25:50.835 "data_size": 63488 00:25:50.835 } 00:25:50.835 ] 00:25:50.835 }' 00:25:50.835 12:07:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:50.835 12:07:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:51.402 12:07:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:51.402 12:07:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:51.661 [2024-07-15 12:07:05.009959] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:51.661 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:25:51.661 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.661 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:51.919 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:25:51.919 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:25:51.919 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:51.919 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:51.919 [2024-07-15 12:07:05.392848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe79a40 00:25:51.919 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:51.919 Zero copy mechanism will not be used. 00:25:51.919 Running I/O for 60 seconds... 00:25:52.178 [2024-07-15 12:07:05.519257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:52.178 [2024-07-15 12:07:05.527483] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe79a40 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
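The `verify_raid_bdev_state` helper traced above fetches the array's JSON via `bdev_raid_get_bdevs` and asserts individual fields with `jq`. The same checks, demonstrated against the degraded-state JSON shown in this trace (field values copied from the log; `jq` is assumed to be installed, as it is on the test nodes):

```shell
# JSON copied from the bdev_raid_get_bdevs output in this trace:
# raid_bdev1 after BaseBdev1 was removed, one of two base bdevs left.
raid_bdev_info='{
  "name": "raid_bdev1",
  "state": "online",
  "raid_level": "raid1",
  "num_base_bdevs": 2,
  "num_base_bdevs_discovered": 1,
  "num_base_bdevs_operational": 1
}'

# Field assertions of the kind verify_raid_bdev_state performs via jq -r.
[ "$(echo "$raid_bdev_info" | jq -r '.state')" = "online" ]
[ "$(echo "$raid_bdev_info" | jq -r '.raid_level')" = "raid1" ]
[ "$(echo "$raid_bdev_info" | jq -r '.num_base_bdevs_operational')" = "1" ]
echo "raid_bdev1: online, raid1, 1 operational base bdev"
```

A raid1 array stays `online` with one operational base bdev, which is exactly the state the test expects after removing `BaseBdev1` but before the spare is rebuilt.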
00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.178 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:52.436 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:52.436 "name": "raid_bdev1", 00:25:52.436 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:52.436 "strip_size_kb": 0, 00:25:52.436 "state": "online", 00:25:52.436 "raid_level": "raid1", 00:25:52.436 "superblock": true, 00:25:52.436 "num_base_bdevs": 2, 00:25:52.436 "num_base_bdevs_discovered": 1, 00:25:52.436 "num_base_bdevs_operational": 1, 00:25:52.436 "base_bdevs_list": [ 00:25:52.436 { 00:25:52.436 "name": null, 00:25:52.436 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:52.436 "is_configured": false, 00:25:52.436 "data_offset": 2048, 00:25:52.436 "data_size": 63488 00:25:52.436 }, 00:25:52.436 { 00:25:52.436 "name": "BaseBdev2", 00:25:52.436 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:52.436 "is_configured": true, 00:25:52.436 "data_offset": 2048, 00:25:52.436 "data_size": 63488 00:25:52.436 } 00:25:52.436 ] 00:25:52.436 }' 00:25:52.436 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:52.436 12:07:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:53.003 12:07:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:53.261 [2024-07-15 12:07:06.646711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:53.261 12:07:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:53.261 [2024-07-15 12:07:06.730174] bdev_raid.c: 
251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xccc600 00:25:53.261 [2024-07-15 12:07:06.732542] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:53.261 [2024-07-15 12:07:06.842272] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:53.261 [2024-07-15 12:07:06.842787] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:53.520 [2024-07-15 12:07:07.078856] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:53.520 [2024-07-15 12:07:07.079102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:54.088 [2024-07-15 12:07:07.392782] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:54.088 [2024-07-15 12:07:07.393158] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:25:54.088 [2024-07-15 12:07:07.629721] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.346 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.605 [2024-07-15 12:07:07.962193] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:54.605 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.605 "name": "raid_bdev1", 00:25:54.605 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:54.605 "strip_size_kb": 0, 00:25:54.605 "state": "online", 00:25:54.605 "raid_level": "raid1", 00:25:54.605 "superblock": true, 00:25:54.605 "num_base_bdevs": 2, 00:25:54.605 "num_base_bdevs_discovered": 2, 00:25:54.605 "num_base_bdevs_operational": 2, 00:25:54.605 "process": { 00:25:54.605 "type": "rebuild", 00:25:54.605 "target": "spare", 00:25:54.605 "progress": { 00:25:54.605 "blocks": 12288, 00:25:54.605 "percent": 19 00:25:54.605 } 00:25:54.605 }, 00:25:54.605 "base_bdevs_list": [ 00:25:54.605 { 00:25:54.605 "name": "spare", 00:25:54.605 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:25:54.605 "is_configured": true, 00:25:54.605 "data_offset": 2048, 00:25:54.605 "data_size": 63488 00:25:54.605 }, 00:25:54.605 { 00:25:54.605 "name": "BaseBdev2", 00:25:54.605 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:54.605 "is_configured": true, 00:25:54.605 "data_offset": 2048, 00:25:54.605 "data_size": 63488 00:25:54.605 } 00:25:54.605 ] 00:25:54.605 }' 00:25:54.605 12:07:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.605 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:54.605 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.605 12:07:08 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:54.605 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:54.863 [2024-07-15 12:07:08.224232] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:55.121 [2024-07-15 12:07:08.616996] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.380 [2024-07-15 12:07:08.799348] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:55.380 [2024-07-15 12:07:08.808998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.380 [2024-07-15 12:07:08.809026] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.380 [2024-07-15 12:07:08.809037] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:55.380 [2024-07-15 12:07:08.830754] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xe79a40 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.380 12:07:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.948 12:07:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.948 "name": "raid_bdev1", 00:25:55.948 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:55.948 "strip_size_kb": 0, 00:25:55.948 "state": "online", 00:25:55.948 "raid_level": "raid1", 00:25:55.948 "superblock": true, 00:25:55.948 "num_base_bdevs": 2, 00:25:55.948 "num_base_bdevs_discovered": 1, 00:25:55.948 "num_base_bdevs_operational": 1, 00:25:55.948 "base_bdevs_list": [ 00:25:55.948 { 00:25:55.948 "name": null, 00:25:55.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.948 "is_configured": false, 00:25:55.948 "data_offset": 2048, 00:25:55.948 "data_size": 63488 00:25:55.948 }, 00:25:55.948 { 00:25:55.948 "name": "BaseBdev2", 00:25:55.948 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:55.948 "is_configured": true, 00:25:55.948 "data_offset": 2048, 00:25:55.948 "data_size": 63488 00:25:55.948 } 00:25:55.948 ] 00:25:55.948 }' 00:25:55.948 12:07:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.948 12:07:09 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:56.885 12:07:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:56.885 "name": "raid_bdev1", 00:25:56.885 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:56.885 "strip_size_kb": 0, 00:25:56.885 "state": "online", 00:25:56.885 "raid_level": "raid1", 00:25:56.885 "superblock": true, 00:25:56.885 "num_base_bdevs": 2, 00:25:56.885 "num_base_bdevs_discovered": 1, 00:25:56.885 "num_base_bdevs_operational": 1, 00:25:56.885 "base_bdevs_list": [ 00:25:56.885 { 00:25:56.885 "name": null, 00:25:56.885 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.885 "is_configured": false, 00:25:56.885 "data_offset": 2048, 00:25:56.885 "data_size": 63488 00:25:56.885 }, 00:25:56.885 { 00:25:56.885 "name": "BaseBdev2", 00:25:56.885 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:56.885 "is_configured": true, 00:25:56.885 "data_offset": 2048, 00:25:56.885 "data_size": 63488 00:25:56.885 } 00:25:56.885 ] 00:25:56.885 }' 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:56.885 12:07:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:56.885 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:57.143 [2024-07-15 12:07:10.684724] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:57.402 [2024-07-15 12:07:10.743507] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd4150 00:25:57.402 12:07:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:57.402 [2024-07-15 12:07:10.744993] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:57.402 [2024-07-15 12:07:10.880094] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:57.402 [2024-07-15 12:07:10.880569] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:57.662 [2024-07-15 12:07:11.015641] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:57.662 [2024-07-15 12:07:11.015812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:57.921 [2024-07-15 12:07:11.515263] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.180 12:07:11 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.180 [2024-07-15 12:07:11.753987] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.180 12:07:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.439 [2024-07-15 12:07:11.880781] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:58.439 [2024-07-15 12:07:11.880992] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:58.439 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.439 "name": "raid_bdev1", 00:25:58.439 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:58.439 "strip_size_kb": 0, 00:25:58.439 "state": "online", 00:25:58.439 "raid_level": "raid1", 00:25:58.439 "superblock": true, 00:25:58.439 "num_base_bdevs": 2, 00:25:58.439 "num_base_bdevs_discovered": 2, 00:25:58.439 "num_base_bdevs_operational": 2, 00:25:58.439 "process": { 00:25:58.439 "type": "rebuild", 00:25:58.439 "target": "spare", 00:25:58.439 "progress": { 00:25:58.439 "blocks": 16384, 00:25:58.439 "percent": 25 00:25:58.439 } 00:25:58.439 }, 00:25:58.439 "base_bdevs_list": [ 00:25:58.439 { 00:25:58.439 "name": "spare", 00:25:58.439 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:25:58.439 "is_configured": true, 00:25:58.439 "data_offset": 2048, 
00:25:58.439 "data_size": 63488 00:25:58.439 }, 00:25:58.439 { 00:25:58.439 "name": "BaseBdev2", 00:25:58.439 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:58.439 "is_configured": true, 00:25:58.439 "data_offset": 2048, 00:25:58.439 "data_size": 63488 00:25:58.439 } 00:25:58.439 ] 00:25:58.439 }' 00:25:58.439 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:58.699 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=866 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.699 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.699 [2024-07-15 12:07:12.202356] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:58.958 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:58.958 "name": "raid_bdev1", 00:25:58.958 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:25:58.958 "strip_size_kb": 0, 00:25:58.958 "state": "online", 00:25:58.958 "raid_level": "raid1", 00:25:58.958 "superblock": true, 00:25:58.958 "num_base_bdevs": 2, 00:25:58.958 "num_base_bdevs_discovered": 2, 00:25:58.958 "num_base_bdevs_operational": 2, 00:25:58.958 "process": { 00:25:58.958 "type": "rebuild", 00:25:58.958 "target": "spare", 00:25:58.958 "progress": { 00:25:58.958 "blocks": 22528, 00:25:58.958 "percent": 35 00:25:58.958 } 00:25:58.958 }, 00:25:58.958 "base_bdevs_list": [ 00:25:58.958 { 00:25:58.958 "name": "spare", 00:25:58.958 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:25:58.958 "is_configured": true, 00:25:58.958 "data_offset": 2048, 00:25:58.958 "data_size": 63488 00:25:58.958 }, 00:25:58.958 { 00:25:58.958 "name": "BaseBdev2", 00:25:58.958 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:25:58.958 "is_configured": true, 00:25:58.959 "data_offset": 2048, 00:25:58.959 "data_size": 63488 00:25:58.959 } 00:25:58.959 ] 00:25:58.959 }' 00:25:58.959 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:25:58.959 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:58.959 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:58.959 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:58.959 12:07:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:59.217 [2024-07-15 12:07:12.615888] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:59.218 [2024-07-15 12:07:12.616141] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:59.478 [2024-07-15 12:07:12.966078] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:25:59.741 [2024-07-15 12:07:13.086386] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:26:00.000 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.260 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.260 "name": "raid_bdev1", 00:26:00.260 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:00.260 "strip_size_kb": 0, 00:26:00.260 "state": "online", 00:26:00.260 "raid_level": "raid1", 00:26:00.260 "superblock": true, 00:26:00.260 "num_base_bdevs": 2, 00:26:00.260 "num_base_bdevs_discovered": 2, 00:26:00.260 "num_base_bdevs_operational": 2, 00:26:00.260 "process": { 00:26:00.260 "type": "rebuild", 00:26:00.260 "target": "spare", 00:26:00.260 "progress": { 00:26:00.260 "blocks": 45056, 00:26:00.260 "percent": 70 00:26:00.260 } 00:26:00.260 }, 00:26:00.260 "base_bdevs_list": [ 00:26:00.260 { 00:26:00.260 "name": "spare", 00:26:00.260 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:00.260 "is_configured": true, 00:26:00.260 "data_offset": 2048, 00:26:00.260 "data_size": 63488 00:26:00.260 }, 00:26:00.260 { 00:26:00.260 "name": "BaseBdev2", 00:26:00.260 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:00.260 "is_configured": true, 00:26:00.260 "data_offset": 2048, 00:26:00.260 "data_size": 63488 00:26:00.260 } 00:26:00.260 ] 00:26:00.260 }' 00:26:00.260 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.260 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.260 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.260 [2024-07-15 12:07:13.808003] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:26:00.260 12:07:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.260 12:07:13 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:00.828 [2024-07-15 12:07:14.127825] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:26:00.828 [2024-07-15 12:07:14.347496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.397 12:07:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.759 [2024-07-15 12:07:15.016475] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.759 "name": "raid_bdev1", 00:26:01.759 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:01.759 "strip_size_kb": 0, 00:26:01.759 "state": "online", 00:26:01.759 "raid_level": "raid1", 00:26:01.759 "superblock": true, 00:26:01.759 "num_base_bdevs": 2, 00:26:01.759 "num_base_bdevs_discovered": 2, 00:26:01.759 
"num_base_bdevs_operational": 2, 00:26:01.759 "process": { 00:26:01.759 "type": "rebuild", 00:26:01.759 "target": "spare", 00:26:01.759 "progress": { 00:26:01.759 "blocks": 63488, 00:26:01.759 "percent": 100 00:26:01.759 } 00:26:01.759 }, 00:26:01.759 "base_bdevs_list": [ 00:26:01.759 { 00:26:01.759 "name": "spare", 00:26:01.759 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:01.759 "is_configured": true, 00:26:01.759 "data_offset": 2048, 00:26:01.759 "data_size": 63488 00:26:01.759 }, 00:26:01.759 { 00:26:01.759 "name": "BaseBdev2", 00:26:01.759 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:01.759 "is_configured": true, 00:26:01.759 "data_offset": 2048, 00:26:01.759 "data_size": 63488 00:26:01.759 } 00:26:01.759 ] 00:26:01.759 }' 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:01.759 [2024-07-15 12:07:15.125023] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:01.759 [2024-07-15 12:07:15.127491] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:01.759 12:07:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.697 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:02.955 "name": "raid_bdev1", 00:26:02.955 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:02.955 "strip_size_kb": 0, 00:26:02.955 "state": "online", 00:26:02.955 "raid_level": "raid1", 00:26:02.955 "superblock": true, 00:26:02.955 "num_base_bdevs": 2, 00:26:02.955 "num_base_bdevs_discovered": 2, 00:26:02.955 "num_base_bdevs_operational": 2, 00:26:02.955 "base_bdevs_list": [ 00:26:02.955 { 00:26:02.955 "name": "spare", 00:26:02.955 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:02.955 "is_configured": true, 00:26:02.955 "data_offset": 2048, 00:26:02.955 "data_size": 63488 00:26:02.955 }, 00:26:02.955 { 00:26:02.955 "name": "BaseBdev2", 00:26:02.955 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:02.955 "is_configured": true, 00:26:02.955 "data_offset": 2048, 00:26:02.955 "data_size": 63488 00:26:02.955 } 00:26:02.955 ] 00:26:02.955 }' 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.955 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.213 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:03.213 "name": "raid_bdev1", 00:26:03.213 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:03.213 "strip_size_kb": 0, 00:26:03.213 "state": "online", 00:26:03.213 "raid_level": "raid1", 00:26:03.213 "superblock": true, 00:26:03.213 "num_base_bdevs": 2, 00:26:03.213 "num_base_bdevs_discovered": 2, 00:26:03.213 "num_base_bdevs_operational": 2, 00:26:03.213 "base_bdevs_list": [ 00:26:03.213 { 00:26:03.213 "name": "spare", 00:26:03.213 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:03.213 "is_configured": true, 00:26:03.213 "data_offset": 2048, 00:26:03.213 "data_size": 63488 00:26:03.213 }, 00:26:03.213 { 00:26:03.213 "name": "BaseBdev2", 00:26:03.213 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:03.213 "is_configured": true, 00:26:03.213 "data_offset": 2048, 00:26:03.213 "data_size": 63488 00:26:03.213 } 
00:26:03.213 ] 00:26:03.213 }' 00:26:03.213 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.472 12:07:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.731 12:07:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.731 "name": "raid_bdev1", 00:26:03.731 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:03.731 "strip_size_kb": 0, 00:26:03.731 "state": "online", 00:26:03.731 "raid_level": "raid1", 00:26:03.731 "superblock": true, 00:26:03.731 "num_base_bdevs": 2, 00:26:03.731 "num_base_bdevs_discovered": 2, 00:26:03.731 "num_base_bdevs_operational": 2, 00:26:03.731 "base_bdevs_list": [ 00:26:03.731 { 00:26:03.731 "name": "spare", 00:26:03.732 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:03.732 "is_configured": true, 00:26:03.732 "data_offset": 2048, 00:26:03.732 "data_size": 63488 00:26:03.732 }, 00:26:03.732 { 00:26:03.732 "name": "BaseBdev2", 00:26:03.732 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:03.732 "is_configured": true, 00:26:03.732 "data_offset": 2048, 00:26:03.732 "data_size": 63488 00:26:03.732 } 00:26:03.732 ] 00:26:03.732 }' 00:26:03.732 12:07:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.732 12:07:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:04.300 12:07:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:04.560 [2024-07-15 12:07:17.929257] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:04.560 [2024-07-15 12:07:17.929300] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:04.560 00:26:04.560 Latency(us) 00:26:04.560 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:04.560 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:26:04.560 raid_bdev1 : 12.56 97.01 291.03 0.00 0.00 13910.12 284.94 119446.48 00:26:04.560 
=================================================================================================================== 00:26:04.560 Total : 97.01 291.03 0.00 0.00 13910.12 284.94 119446.48 00:26:04.560 [2024-07-15 12:07:17.985330] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:04.560 [2024-07-15 12:07:17.985357] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:04.560 [2024-07-15 12:07:17.985433] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:04.560 [2024-07-15 12:07:17.985445] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe7dab0 name raid_bdev1, state offline 00:26:04.560 0 00:26:04.560 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.560 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 
-- # local nbd_list 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:04.818 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:26:05.385 /dev/nbd0 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.385 1+0 records in 00:26:05.385 1+0 records out 00:26:05.385 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265675 s, 15.4 MB/s 00:26:05.385 12:07:18 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.385 12:07:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:26:05.644 /dev/nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:05.644 1+0 records in 00:26:05.644 1+0 records out 00:26:05.644 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311614 s, 13.1 MB/s 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:26:05.644 12:07:19 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:05.644 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:05.902 12:07:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:05.902 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 
00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:06.161 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:06.420 12:07:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:06.680 [2024-07-15 12:07:20.062353] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:06.680 [2024-07-15 12:07:20.062409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:06.680 [2024-07-15 12:07:20.062431] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe79860 00:26:06.680 [2024-07-15 12:07:20.062443] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:06.680 [2024-07-15 12:07:20.064193] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:06.680 [2024-07-15 12:07:20.064223] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:06.680 [2024-07-15 12:07:20.064306] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:06.680 [2024-07-15 12:07:20.064332] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:06.680 [2024-07-15 12:07:20.064437] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:06.680 spare 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.680 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.680 [2024-07-15 12:07:20.164751] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xccb260 00:26:06.680 [2024-07-15 12:07:20.164770] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:26:06.680 [2024-07-15 12:07:20.164964] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe7aef0 00:26:06.680 [2024-07-15 12:07:20.165117] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xccb260 00:26:06.680 [2024-07-15 12:07:20.165127] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xccb260 00:26:06.680 [2024-07-15 12:07:20.165240] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.939 12:07:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.939 "name": "raid_bdev1", 00:26:06.939 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:06.939 "strip_size_kb": 0, 00:26:06.939 "state": "online", 00:26:06.939 "raid_level": "raid1", 00:26:06.939 "superblock": true, 00:26:06.939 "num_base_bdevs": 2, 00:26:06.939 "num_base_bdevs_discovered": 2, 00:26:06.939 "num_base_bdevs_operational": 2, 00:26:06.939 "base_bdevs_list": [ 00:26:06.939 { 00:26:06.939 "name": "spare", 00:26:06.939 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:06.939 "is_configured": true, 00:26:06.939 "data_offset": 2048, 00:26:06.939 "data_size": 63488 00:26:06.939 }, 00:26:06.939 { 00:26:06.939 "name": "BaseBdev2", 00:26:06.939 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:06.939 "is_configured": true, 00:26:06.939 "data_offset": 2048, 00:26:06.939 "data_size": 63488 00:26:06.939 } 00:26:06.939 ] 00:26:06.939 }' 00:26:06.939 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.939 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.506 12:07:20 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.764 "name": "raid_bdev1", 00:26:07.764 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:07.764 "strip_size_kb": 0, 00:26:07.764 "state": "online", 00:26:07.764 "raid_level": "raid1", 00:26:07.764 "superblock": true, 00:26:07.764 "num_base_bdevs": 2, 00:26:07.764 "num_base_bdevs_discovered": 2, 00:26:07.764 "num_base_bdevs_operational": 2, 00:26:07.764 "base_bdevs_list": [ 00:26:07.764 { 00:26:07.764 "name": "spare", 00:26:07.764 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:07.764 "is_configured": true, 00:26:07.764 "data_offset": 2048, 00:26:07.764 "data_size": 63488 00:26:07.764 }, 00:26:07.764 { 00:26:07.764 "name": "BaseBdev2", 00:26:07.764 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:07.764 "is_configured": true, 00:26:07.764 "data_offset": 2048, 00:26:07.764 "data_size": 63488 00:26:07.764 } 00:26:07.764 ] 00:26:07.764 }' 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.764 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:08.023 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:08.023 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:08.281 [2024-07-15 12:07:21.743471] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.281 12:07:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.540 12:07:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:08.540 "name": "raid_bdev1", 00:26:08.540 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:08.540 "strip_size_kb": 0, 00:26:08.540 "state": "online", 00:26:08.540 
"raid_level": "raid1", 00:26:08.540 "superblock": true, 00:26:08.540 "num_base_bdevs": 2, 00:26:08.540 "num_base_bdevs_discovered": 1, 00:26:08.540 "num_base_bdevs_operational": 1, 00:26:08.540 "base_bdevs_list": [ 00:26:08.540 { 00:26:08.540 "name": null, 00:26:08.540 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:08.540 "is_configured": false, 00:26:08.540 "data_offset": 2048, 00:26:08.540 "data_size": 63488 00:26:08.540 }, 00:26:08.540 { 00:26:08.540 "name": "BaseBdev2", 00:26:08.540 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:08.540 "is_configured": true, 00:26:08.540 "data_offset": 2048, 00:26:08.540 "data_size": 63488 00:26:08.540 } 00:26:08.540 ] 00:26:08.540 }' 00:26:08.540 12:07:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:08.540 12:07:22 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:09.108 12:07:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:09.368 [2024-07-15 12:07:22.846656] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:09.368 [2024-07-15 12:07:22.846818] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:09.368 [2024-07-15 12:07:22.846835] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:09.368 [2024-07-15 12:07:22.846863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:09.368 [2024-07-15 12:07:22.852153] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xcd45c0 00:26:09.368 [2024-07-15 12:07:22.854268] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:09.368 12:07:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.305 12:07:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.565 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.565 "name": "raid_bdev1", 00:26:10.565 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:10.565 "strip_size_kb": 0, 00:26:10.565 "state": "online", 00:26:10.565 "raid_level": "raid1", 00:26:10.565 "superblock": true, 00:26:10.565 "num_base_bdevs": 2, 00:26:10.565 "num_base_bdevs_discovered": 2, 00:26:10.565 "num_base_bdevs_operational": 2, 00:26:10.565 "process": { 00:26:10.565 "type": "rebuild", 00:26:10.565 "target": "spare", 00:26:10.565 "progress": { 00:26:10.565 "blocks": 24576, 
00:26:10.565 "percent": 38 00:26:10.565 } 00:26:10.565 }, 00:26:10.565 "base_bdevs_list": [ 00:26:10.565 { 00:26:10.565 "name": "spare", 00:26:10.565 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:10.565 "is_configured": true, 00:26:10.565 "data_offset": 2048, 00:26:10.565 "data_size": 63488 00:26:10.565 }, 00:26:10.565 { 00:26:10.565 "name": "BaseBdev2", 00:26:10.565 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:10.565 "is_configured": true, 00:26:10.565 "data_offset": 2048, 00:26:10.565 "data_size": 63488 00:26:10.565 } 00:26:10.565 ] 00:26:10.565 }' 00:26:10.565 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.825 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.825 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.825 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.825 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:11.084 [2024-07-15 12:07:24.445157] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:11.084 [2024-07-15 12:07:24.466893] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:11.084 [2024-07-15 12:07:24.466937] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.084 [2024-07-15 12:07:24.466952] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:11.084 [2024-07-15 12:07:24.466960] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.084 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:11.343 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.343 "name": "raid_bdev1", 00:26:11.343 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:11.343 "strip_size_kb": 0, 00:26:11.343 "state": "online", 00:26:11.343 "raid_level": "raid1", 00:26:11.343 "superblock": true, 00:26:11.343 "num_base_bdevs": 2, 00:26:11.343 "num_base_bdevs_discovered": 1, 00:26:11.343 "num_base_bdevs_operational": 1, 00:26:11.343 "base_bdevs_list": [ 00:26:11.343 { 00:26:11.343 "name": null, 00:26:11.343 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:11.344 "is_configured": false, 00:26:11.344 
"data_offset": 2048, 00:26:11.344 "data_size": 63488 00:26:11.344 }, 00:26:11.344 { 00:26:11.344 "name": "BaseBdev2", 00:26:11.344 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:11.344 "is_configured": true, 00:26:11.344 "data_offset": 2048, 00:26:11.344 "data_size": 63488 00:26:11.344 } 00:26:11.344 ] 00:26:11.344 }' 00:26:11.344 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.344 12:07:24 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:11.911 12:07:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:12.170 [2024-07-15 12:07:25.558213] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:12.170 [2024-07-15 12:07:25.558270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:12.170 [2024-07-15 12:07:25.558295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xccb5e0 00:26:12.170 [2024-07-15 12:07:25.558310] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:12.170 [2024-07-15 12:07:25.558700] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:12.170 [2024-07-15 12:07:25.558719] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:12.170 [2024-07-15 12:07:25.558798] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:12.170 [2024-07-15 12:07:25.558810] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:12.170 [2024-07-15 12:07:25.558820] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:12.170 [2024-07-15 12:07:25.558841] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:12.170 [2024-07-15 12:07:25.564073] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe7d5c0 00:26:12.170 spare 00:26:12.170 [2024-07-15 12:07:25.565537] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:12.170 12:07:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.107 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:13.366 "name": "raid_bdev1", 00:26:13.366 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:13.366 "strip_size_kb": 0, 00:26:13.366 "state": "online", 00:26:13.366 "raid_level": "raid1", 00:26:13.366 "superblock": true, 00:26:13.366 "num_base_bdevs": 2, 00:26:13.366 "num_base_bdevs_discovered": 2, 00:26:13.366 "num_base_bdevs_operational": 2, 00:26:13.366 "process": { 00:26:13.366 "type": "rebuild", 00:26:13.366 "target": "spare", 00:26:13.366 "progress": { 00:26:13.366 
"blocks": 22528, 00:26:13.366 "percent": 35 00:26:13.366 } 00:26:13.366 }, 00:26:13.366 "base_bdevs_list": [ 00:26:13.366 { 00:26:13.366 "name": "spare", 00:26:13.366 "uuid": "375fbc92-d99e-5545-ac10-70522e4c058a", 00:26:13.366 "is_configured": true, 00:26:13.366 "data_offset": 2048, 00:26:13.366 "data_size": 63488 00:26:13.366 }, 00:26:13.366 { 00:26:13.366 "name": "BaseBdev2", 00:26:13.366 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:13.366 "is_configured": true, 00:26:13.366 "data_offset": 2048, 00:26:13.366 "data_size": 63488 00:26:13.366 } 00:26:13.366 ] 00:26:13.366 }' 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:13.366 12:07:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:13.625 [2024-07-15 12:07:27.097040] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:13.625 [2024-07-15 12:07:27.178155] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:13.625 [2024-07-15 12:07:27.178199] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:13.625 [2024-07-15 12:07:27.178214] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:13.626 [2024-07-15 12:07:27.178222] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.626 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:13.884 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:13.884 "name": "raid_bdev1", 00:26:13.884 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:13.884 "strip_size_kb": 0, 00:26:13.884 "state": "online", 00:26:13.884 "raid_level": "raid1", 00:26:13.884 "superblock": true, 00:26:13.884 "num_base_bdevs": 2, 00:26:13.884 "num_base_bdevs_discovered": 1, 00:26:13.884 "num_base_bdevs_operational": 1, 00:26:13.884 "base_bdevs_list": [ 00:26:13.884 { 00:26:13.884 "name": null, 00:26:13.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:13.884 "is_configured": false, 00:26:13.884 
"data_offset": 2048, 00:26:13.884 "data_size": 63488 00:26:13.884 }, 00:26:13.884 { 00:26:13.884 "name": "BaseBdev2", 00:26:13.884 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:13.884 "is_configured": true, 00:26:13.884 "data_offset": 2048, 00:26:13.884 "data_size": 63488 00:26:13.884 } 00:26:13.884 ] 00:26:13.884 }' 00:26:13.884 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:13.884 12:07:27 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:14.822 "name": "raid_bdev1", 00:26:14.822 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:14.822 "strip_size_kb": 0, 00:26:14.822 "state": "online", 00:26:14.822 "raid_level": "raid1", 00:26:14.822 "superblock": true, 00:26:14.822 "num_base_bdevs": 2, 00:26:14.822 "num_base_bdevs_discovered": 1, 00:26:14.822 "num_base_bdevs_operational": 1, 00:26:14.822 "base_bdevs_list": [ 00:26:14.822 { 00:26:14.822 "name": null, 00:26:14.822 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:14.822 "is_configured": false, 00:26:14.822 "data_offset": 2048, 00:26:14.822 "data_size": 63488 00:26:14.822 }, 00:26:14.822 { 00:26:14.822 "name": "BaseBdev2", 00:26:14.822 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:14.822 "is_configured": true, 00:26:14.822 "data_offset": 2048, 00:26:14.822 "data_size": 63488 00:26:14.822 } 00:26:14.822 ] 00:26:14.822 }' 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:14.822 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:15.082 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:15.341 [2024-07-15 12:07:28.855526] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:15.341 [2024-07-15 12:07:28.855580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:15.341 [2024-07-15 12:07:28.855599] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd4bf0 00:26:15.341 [2024-07-15 12:07:28.855612] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:15.341 [2024-07-15 12:07:28.855993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:15.341 [2024-07-15 12:07:28.856016] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:15.341 [2024-07-15 12:07:28.856086] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:15.341 [2024-07-15 12:07:28.856099] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:15.341 [2024-07-15 12:07:28.856109] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:15.341 BaseBdev1 00:26:15.341 12:07:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:16.281 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:16.540 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.540 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.541 12:07:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.541 12:07:29 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.541 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.541 "name": "raid_bdev1", 00:26:16.541 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:16.541 "strip_size_kb": 0, 00:26:16.541 "state": "online", 00:26:16.541 "raid_level": "raid1", 00:26:16.541 "superblock": true, 00:26:16.541 "num_base_bdevs": 2, 00:26:16.541 "num_base_bdevs_discovered": 1, 00:26:16.541 "num_base_bdevs_operational": 1, 00:26:16.541 "base_bdevs_list": [ 00:26:16.541 { 00:26:16.541 "name": null, 00:26:16.541 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.541 "is_configured": false, 00:26:16.541 "data_offset": 2048, 00:26:16.541 "data_size": 63488 00:26:16.541 }, 00:26:16.541 { 00:26:16.541 "name": "BaseBdev2", 00:26:16.541 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:16.541 "is_configured": true, 00:26:16.541 "data_offset": 2048, 00:26:16.541 "data_size": 63488 00:26:16.541 } 00:26:16.541 ] 00:26:16.541 }' 00:26:16.541 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.541 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.480 12:07:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:17.480 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:17.480 "name": "raid_bdev1", 00:26:17.480 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:17.480 "strip_size_kb": 0, 00:26:17.480 "state": "online", 00:26:17.480 "raid_level": "raid1", 00:26:17.480 "superblock": true, 00:26:17.480 "num_base_bdevs": 2, 00:26:17.480 "num_base_bdevs_discovered": 1, 00:26:17.480 "num_base_bdevs_operational": 1, 00:26:17.480 "base_bdevs_list": [ 00:26:17.480 { 00:26:17.480 "name": null, 00:26:17.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:17.480 "is_configured": false, 00:26:17.480 "data_offset": 2048, 00:26:17.480 "data_size": 63488 00:26:17.480 }, 00:26:17.480 { 00:26:17.480 "name": "BaseBdev2", 00:26:17.480 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:17.480 "is_configured": true, 00:26:17.480 "data_offset": 2048, 00:26:17.480 "data_size": 63488 00:26:17.480 } 00:26:17.480 ] 00:26:17.480 }' 00:26:17.480 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:17.480 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local 
es=0 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:17.740 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:17.999 [2024-07-15 12:07:31.354689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:17.999 [2024-07-15 12:07:31.354822] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:17.999 
[2024-07-15 12:07:31.354838] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:17.999 request: 00:26:17.999 { 00:26:17.999 "base_bdev": "BaseBdev1", 00:26:17.999 "raid_bdev": "raid_bdev1", 00:26:17.999 "method": "bdev_raid_add_base_bdev", 00:26:17.999 "req_id": 1 00:26:17.999 } 00:26:17.999 Got JSON-RPC error response 00:26:17.999 response: 00:26:17.999 { 00:26:17.999 "code": -22, 00:26:17.999 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:17.999 } 00:26:17.999 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:26:17.999 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:26:17.999 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:26:17.999 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:26:17.999 12:07:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:18.939 12:07:32 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.939 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.198 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.198 "name": "raid_bdev1", 00:26:19.198 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:19.198 "strip_size_kb": 0, 00:26:19.198 "state": "online", 00:26:19.198 "raid_level": "raid1", 00:26:19.198 "superblock": true, 00:26:19.198 "num_base_bdevs": 2, 00:26:19.198 "num_base_bdevs_discovered": 1, 00:26:19.198 "num_base_bdevs_operational": 1, 00:26:19.198 "base_bdevs_list": [ 00:26:19.198 { 00:26:19.198 "name": null, 00:26:19.198 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.198 "is_configured": false, 00:26:19.198 "data_offset": 2048, 00:26:19.198 "data_size": 63488 00:26:19.198 }, 00:26:19.198 { 00:26:19.198 "name": "BaseBdev2", 00:26:19.198 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:19.198 "is_configured": true, 00:26:19.198 "data_offset": 2048, 00:26:19.198 "data_size": 63488 00:26:19.198 } 00:26:19.198 ] 00:26:19.198 }' 00:26:19.198 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.198 12:07:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:20.135 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:20.135 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.136 12:07:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:20.136 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:20.136 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.136 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.136 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:20.394 "name": "raid_bdev1", 00:26:20.394 "uuid": "bbf90ca2-ef0e-4113-8c7d-0b793b6ec966", 00:26:20.394 "strip_size_kb": 0, 00:26:20.394 "state": "online", 00:26:20.394 "raid_level": "raid1", 00:26:20.394 "superblock": true, 00:26:20.394 "num_base_bdevs": 2, 00:26:20.394 "num_base_bdevs_discovered": 1, 00:26:20.394 "num_base_bdevs_operational": 1, 00:26:20.394 "base_bdevs_list": [ 00:26:20.394 { 00:26:20.394 "name": null, 00:26:20.394 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:20.394 "is_configured": false, 00:26:20.394 "data_offset": 2048, 00:26:20.394 "data_size": 63488 00:26:20.394 }, 00:26:20.394 { 00:26:20.394 "name": "BaseBdev2", 00:26:20.394 "uuid": "e25ed76a-93ba-551f-bcf3-5ba3990a8172", 00:26:20.394 "is_configured": true, 00:26:20.394 "data_offset": 2048, 00:26:20.394 "data_size": 63488 00:26:20.394 } 00:26:20.394 ] 00:26:20.394 }' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:20.394 12:07:33 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1575446 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1575446 ']' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1575446 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1575446 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1575446' 00:26:20.394 killing process with pid 1575446 00:26:20.394 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1575446 00:26:20.394 Received shutdown signal, test time was about 28.449396 seconds 00:26:20.394 00:26:20.394 Latency(us) 00:26:20.394 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:20.394 =================================================================================================================== 00:26:20.394 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:26:20.394 [2024-07-15 12:07:33.913808] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:20.395 [2024-07-15 12:07:33.913918] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:20.395 [2024-07-15 12:07:33.913971] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:26:20.395 [2024-07-15 12:07:33.913984] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xccb260 name raid_bdev1, state offline 00:26:20.395 12:07:33 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1575446 00:26:20.395 [2024-07-15 12:07:33.937484] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:20.654 12:07:34 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:26:20.654 00:26:20.654 real 0m33.176s 00:26:20.654 user 0m51.995s 00:26:20.654 sys 0m4.986s 00:26:20.654 12:07:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:20.654 12:07:34 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:26:20.654 ************************************ 00:26:20.654 END TEST raid_rebuild_test_sb_io 00:26:20.654 ************************************ 00:26:20.654 12:07:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:20.654 12:07:34 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:26:20.654 12:07:34 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:26:20.654 12:07:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:20.654 12:07:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:26:20.654 12:07:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:20.914 ************************************ 00:26:20.914 START TEST raid_rebuild_test 00:26:20.914 ************************************ 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false false true 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:20.914 12:07:34 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 
00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=1580112 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 1580112 /var/tmp/spdk-raid.sock 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@829 -- # '[' -z 1580112 ']' 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:20.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:20.914 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:20.914 [2024-07-15 12:07:34.329005] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:26:20.914 [2024-07-15 12:07:34.329069] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1580112 ] 00:26:20.914 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:20.914 Zero copy mechanism will not be used. 00:26:20.914 [2024-07-15 12:07:34.446654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:21.174 [2024-07-15 12:07:34.553622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:21.174 [2024-07-15 12:07:34.614103] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:21.174 [2024-07-15 12:07:34.614136] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:21.433 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:21.433 12:07:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@862 -- # return 0 00:26:21.433 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:21.433 12:07:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:21.433 BaseBdev1_malloc 00:26:21.692 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:21.692 [2024-07-15 
12:07:35.274836] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:21.692 [2024-07-15 12:07:35.274885] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:21.692 [2024-07-15 12:07:35.274910] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18bd9c0 00:26:21.692 [2024-07-15 12:07:35.274922] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:21.692 [2024-07-15 12:07:35.276657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:21.692 [2024-07-15 12:07:35.276693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:21.692 BaseBdev1 00:26:21.951 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:21.951 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:26:21.951 BaseBdev2_malloc 00:26:22.211 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:22.211 [2024-07-15 12:07:35.773449] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:22.211 [2024-07-15 12:07:35.773496] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.211 [2024-07-15 12:07:35.773519] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18be510 00:26:22.211 [2024-07-15 12:07:35.773532] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.211 [2024-07-15 12:07:35.775135] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.211 [2024-07-15 12:07:35.775165] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:22.211 BaseBdev2 00:26:22.211 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:22.211 12:07:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:26:22.470 BaseBdev3_malloc 00:26:22.470 12:07:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:26:22.730 [2024-07-15 12:07:36.259332] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:26:22.730 [2024-07-15 12:07:36.259379] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:22.730 [2024-07-15 12:07:36.259400] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a66310 00:26:22.730 [2024-07-15 12:07:36.259413] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:22.730 [2024-07-15 12:07:36.260953] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:22.730 [2024-07-15 12:07:36.260981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:26:22.730 BaseBdev3 00:26:22.730 12:07:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:22.730 12:07:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:26:22.990 BaseBdev4_malloc 00:26:22.990 12:07:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:26:23.249 [2024-07-15 12:07:36.753210] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:26:23.249 [2024-07-15 12:07:36.753260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:23.249 [2024-07-15 12:07:36.753283] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a667f0 00:26:23.249 [2024-07-15 12:07:36.753295] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:23.249 [2024-07-15 12:07:36.754863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:23.249 [2024-07-15 12:07:36.754891] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:26:23.249 BaseBdev4 00:26:23.249 12:07:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:26:23.507 spare_malloc 00:26:23.507 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:23.809 spare_delay 00:26:23.809 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:24.097 [2024-07-15 12:07:37.476872] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:24.097 [2024-07-15 12:07:37.476920] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:24.097 [2024-07-15 12:07:37.476944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18b6700 00:26:24.097 [2024-07-15 12:07:37.476956] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:24.097 
[2024-07-15 12:07:37.478562] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:24.097 [2024-07-15 12:07:37.478593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:24.097 spare 00:26:24.097 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:26:24.356 [2024-07-15 12:07:37.721533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:24.356 [2024-07-15 12:07:37.722869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:24.356 [2024-07-15 12:07:37.722924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:26:24.356 [2024-07-15 12:07:37.722975] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:26:24.356 [2024-07-15 12:07:37.723055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18b7dc0 00:26:24.356 [2024-07-15 12:07:37.723065] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:26:24.356 [2024-07-15 12:07:37.723280] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18bbf70 00:26:24.356 [2024-07-15 12:07:37.723431] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18b7dc0 00:26:24.356 [2024-07-15 12:07:37.723441] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18b7dc0 00:26:24.356 [2024-07-15 12:07:37.723560] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.356 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.616 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.616 "name": "raid_bdev1", 00:26:24.616 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:24.616 "strip_size_kb": 0, 00:26:24.616 "state": "online", 00:26:24.616 "raid_level": "raid1", 00:26:24.616 "superblock": false, 00:26:24.616 "num_base_bdevs": 4, 00:26:24.616 "num_base_bdevs_discovered": 4, 00:26:24.616 "num_base_bdevs_operational": 4, 00:26:24.616 "base_bdevs_list": [ 00:26:24.616 { 00:26:24.616 "name": "BaseBdev1", 00:26:24.616 "uuid": "eeae0f79-a2e4-529d-86ae-a9ce855febe8", 00:26:24.616 "is_configured": true, 00:26:24.616 "data_offset": 0, 00:26:24.616 "data_size": 65536 00:26:24.616 }, 00:26:24.616 { 00:26:24.616 "name": "BaseBdev2", 00:26:24.616 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 
00:26:24.616 "is_configured": true, 00:26:24.616 "data_offset": 0, 00:26:24.616 "data_size": 65536 00:26:24.616 }, 00:26:24.616 { 00:26:24.616 "name": "BaseBdev3", 00:26:24.616 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:24.616 "is_configured": true, 00:26:24.616 "data_offset": 0, 00:26:24.616 "data_size": 65536 00:26:24.616 }, 00:26:24.616 { 00:26:24.616 "name": "BaseBdev4", 00:26:24.616 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:24.616 "is_configured": true, 00:26:24.616 "data_offset": 0, 00:26:24.616 "data_size": 65536 00:26:24.616 } 00:26:24.616 ] 00:26:24.616 }' 00:26:24.616 12:07:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.616 12:07:37 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:25.184 12:07:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:25.184 12:07:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:25.444 [2024-07-15 12:07:38.788656] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:25.444 12:07:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:26:25.444 12:07:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.444 12:07:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # 
local write_unit_size 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:25.704 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:25.704 [2024-07-15 12:07:39.273675] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18b8fc0 00:26:25.704 /dev/nbd0 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q 
-w nbd0 /proc/partitions 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:25.963 1+0 records in 00:26:25.963 1+0 records out 00:26:25.963 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245337 s, 16.7 MB/s 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:25.963 12:07:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:26:34.080 65536+0 records in 00:26:34.080 65536+0 records out 00:26:34.080 33554432 bytes (34 MB, 32 MiB) copied, 8.10172 s, 4.1 MB/s 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:34.080 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:34.339 [2024-07-15 12:07:47.702998] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:34.339 [2024-07-15 12:07:47.867453] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:34.339 
12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.339 12:07:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:34.597 12:07:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.597 "name": "raid_bdev1", 00:26:34.597 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:34.597 "strip_size_kb": 0, 00:26:34.597 "state": "online", 00:26:34.597 "raid_level": "raid1", 00:26:34.597 "superblock": false, 00:26:34.597 "num_base_bdevs": 4, 00:26:34.597 "num_base_bdevs_discovered": 3, 00:26:34.597 "num_base_bdevs_operational": 3, 00:26:34.597 "base_bdevs_list": [ 00:26:34.597 { 00:26:34.597 "name": null, 00:26:34.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:34.597 "is_configured": 
false, 00:26:34.597 "data_offset": 0, 00:26:34.597 "data_size": 65536 00:26:34.597 }, 00:26:34.597 { 00:26:34.597 "name": "BaseBdev2", 00:26:34.597 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 00:26:34.597 "is_configured": true, 00:26:34.598 "data_offset": 0, 00:26:34.598 "data_size": 65536 00:26:34.598 }, 00:26:34.598 { 00:26:34.598 "name": "BaseBdev3", 00:26:34.598 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:34.598 "is_configured": true, 00:26:34.598 "data_offset": 0, 00:26:34.598 "data_size": 65536 00:26:34.598 }, 00:26:34.598 { 00:26:34.598 "name": "BaseBdev4", 00:26:34.598 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:34.598 "is_configured": true, 00:26:34.598 "data_offset": 0, 00:26:34.598 "data_size": 65536 00:26:34.598 } 00:26:34.598 ] 00:26:34.598 }' 00:26:34.598 12:07:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.598 12:07:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:35.164 12:07:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:35.422 [2024-07-15 12:07:48.978448] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:35.422 [2024-07-15 12:07:48.982517] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18bce80 00:26:35.422 [2024-07-15 12:07:48.984770] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:35.422 12:07:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:36.799 "name": "raid_bdev1", 00:26:36.799 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:36.799 "strip_size_kb": 0, 00:26:36.799 "state": "online", 00:26:36.799 "raid_level": "raid1", 00:26:36.799 "superblock": false, 00:26:36.799 "num_base_bdevs": 4, 00:26:36.799 "num_base_bdevs_discovered": 4, 00:26:36.799 "num_base_bdevs_operational": 4, 00:26:36.799 "process": { 00:26:36.799 "type": "rebuild", 00:26:36.799 "target": "spare", 00:26:36.799 "progress": { 00:26:36.799 "blocks": 24576, 00:26:36.799 "percent": 37 00:26:36.799 } 00:26:36.799 }, 00:26:36.799 "base_bdevs_list": [ 00:26:36.799 { 00:26:36.799 "name": "spare", 00:26:36.799 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:36.799 "is_configured": true, 00:26:36.799 "data_offset": 0, 00:26:36.799 "data_size": 65536 00:26:36.799 }, 00:26:36.799 { 00:26:36.799 "name": "BaseBdev2", 00:26:36.799 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 00:26:36.799 "is_configured": true, 00:26:36.799 "data_offset": 0, 00:26:36.799 "data_size": 65536 00:26:36.799 }, 00:26:36.799 { 00:26:36.799 "name": "BaseBdev3", 00:26:36.799 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:36.799 "is_configured": true, 00:26:36.799 "data_offset": 0, 00:26:36.799 "data_size": 65536 00:26:36.799 }, 00:26:36.799 { 00:26:36.799 "name": "BaseBdev4", 00:26:36.799 "uuid": 
"b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:36.799 "is_configured": true, 00:26:36.799 "data_offset": 0, 00:26:36.799 "data_size": 65536 00:26:36.799 } 00:26:36.799 ] 00:26:36.799 }' 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:36.799 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:37.058 [2024-07-15 12:07:50.587727] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:37.058 [2024-07-15 12:07:50.597640] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:37.058 [2024-07-15 12:07:50.597691] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:37.058 [2024-07-15 12:07:50.597709] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:37.058 [2024-07-15 12:07:50.597717] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.058 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:37.316 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.317 "name": "raid_bdev1", 00:26:37.317 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:37.317 "strip_size_kb": 0, 00:26:37.317 "state": "online", 00:26:37.317 "raid_level": "raid1", 00:26:37.317 "superblock": false, 00:26:37.317 "num_base_bdevs": 4, 00:26:37.317 "num_base_bdevs_discovered": 3, 00:26:37.317 "num_base_bdevs_operational": 3, 00:26:37.317 "base_bdevs_list": [ 00:26:37.317 { 00:26:37.317 "name": null, 00:26:37.317 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:37.317 "is_configured": false, 00:26:37.317 "data_offset": 0, 00:26:37.317 "data_size": 65536 00:26:37.317 }, 00:26:37.317 { 00:26:37.317 "name": "BaseBdev2", 00:26:37.317 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 00:26:37.317 "is_configured": true, 00:26:37.317 "data_offset": 0, 00:26:37.317 "data_size": 65536 00:26:37.317 }, 00:26:37.317 { 00:26:37.317 "name": "BaseBdev3", 00:26:37.317 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:37.317 "is_configured": true, 00:26:37.317 "data_offset": 0, 00:26:37.317 "data_size": 65536 
00:26:37.317 }, 00:26:37.317 { 00:26:37.317 "name": "BaseBdev4", 00:26:37.317 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:37.317 "is_configured": true, 00:26:37.317 "data_offset": 0, 00:26:37.317 "data_size": 65536 00:26:37.317 } 00:26:37.317 ] 00:26:37.317 }' 00:26:37.317 12:07:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.317 12:07:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.883 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.141 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:38.142 "name": "raid_bdev1", 00:26:38.142 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:38.142 "strip_size_kb": 0, 00:26:38.142 "state": "online", 00:26:38.142 "raid_level": "raid1", 00:26:38.142 "superblock": false, 00:26:38.142 "num_base_bdevs": 4, 00:26:38.142 "num_base_bdevs_discovered": 3, 00:26:38.142 "num_base_bdevs_operational": 3, 00:26:38.142 "base_bdevs_list": [ 00:26:38.142 { 00:26:38.142 "name": null, 00:26:38.142 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:38.142 "is_configured": false, 00:26:38.142 "data_offset": 0, 00:26:38.142 
"data_size": 65536 00:26:38.142 }, 00:26:38.142 { 00:26:38.142 "name": "BaseBdev2", 00:26:38.142 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 00:26:38.142 "is_configured": true, 00:26:38.142 "data_offset": 0, 00:26:38.142 "data_size": 65536 00:26:38.142 }, 00:26:38.142 { 00:26:38.142 "name": "BaseBdev3", 00:26:38.142 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:38.142 "is_configured": true, 00:26:38.142 "data_offset": 0, 00:26:38.142 "data_size": 65536 00:26:38.142 }, 00:26:38.142 { 00:26:38.142 "name": "BaseBdev4", 00:26:38.142 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:38.142 "is_configured": true, 00:26:38.142 "data_offset": 0, 00:26:38.142 "data_size": 65536 00:26:38.142 } 00:26:38.142 ] 00:26:38.142 }' 00:26:38.142 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:38.142 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:38.142 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:38.400 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:38.400 12:07:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:38.400 [2024-07-15 12:07:51.994002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:38.659 [2024-07-15 12:07:51.998053] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18b9550 00:26:38.659 [2024-07-15 12:07:51.999550] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:38.659 12:07:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:39.596 
12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.596 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:39.853 "name": "raid_bdev1", 00:26:39.853 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:39.853 "strip_size_kb": 0, 00:26:39.853 "state": "online", 00:26:39.853 "raid_level": "raid1", 00:26:39.853 "superblock": false, 00:26:39.853 "num_base_bdevs": 4, 00:26:39.853 "num_base_bdevs_discovered": 4, 00:26:39.853 "num_base_bdevs_operational": 4, 00:26:39.853 "process": { 00:26:39.853 "type": "rebuild", 00:26:39.853 "target": "spare", 00:26:39.853 "progress": { 00:26:39.853 "blocks": 24576, 00:26:39.853 "percent": 37 00:26:39.853 } 00:26:39.853 }, 00:26:39.853 "base_bdevs_list": [ 00:26:39.853 { 00:26:39.853 "name": "spare", 00:26:39.853 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:39.853 "is_configured": true, 00:26:39.853 "data_offset": 0, 00:26:39.853 "data_size": 65536 00:26:39.853 }, 00:26:39.853 { 00:26:39.853 "name": "BaseBdev2", 00:26:39.853 "uuid": "5d6dfcc0-c760-5237-8511-cd3330af0a38", 00:26:39.853 "is_configured": true, 00:26:39.853 "data_offset": 0, 00:26:39.853 "data_size": 65536 00:26:39.853 }, 00:26:39.853 { 00:26:39.853 "name": "BaseBdev3", 00:26:39.853 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:39.853 
"is_configured": true, 00:26:39.853 "data_offset": 0, 00:26:39.853 "data_size": 65536 00:26:39.853 }, 00:26:39.853 { 00:26:39.853 "name": "BaseBdev4", 00:26:39.853 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:39.853 "is_configured": true, 00:26:39.853 "data_offset": 0, 00:26:39.853 "data_size": 65536 00:26:39.853 } 00:26:39.853 ] 00:26:39.853 }' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:26:39.853 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:26:40.111 [2024-07-15 12:07:53.583328] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:40.111 [2024-07-15 12:07:53.612196] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x18b9550 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.111 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.369 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.369 "name": "raid_bdev1", 00:26:40.369 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:40.369 "strip_size_kb": 0, 00:26:40.369 "state": "online", 00:26:40.369 "raid_level": "raid1", 00:26:40.369 "superblock": false, 00:26:40.369 "num_base_bdevs": 4, 00:26:40.369 "num_base_bdevs_discovered": 3, 00:26:40.369 "num_base_bdevs_operational": 3, 00:26:40.369 "process": { 00:26:40.369 "type": "rebuild", 00:26:40.369 "target": "spare", 00:26:40.369 "progress": { 00:26:40.369 "blocks": 36864, 00:26:40.369 "percent": 56 00:26:40.369 } 00:26:40.369 }, 00:26:40.369 "base_bdevs_list": [ 00:26:40.369 { 00:26:40.369 "name": "spare", 00:26:40.369 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:40.369 "is_configured": true, 00:26:40.369 "data_offset": 0, 00:26:40.369 "data_size": 65536 00:26:40.369 }, 00:26:40.369 { 00:26:40.369 "name": null, 00:26:40.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.369 "is_configured": false, 00:26:40.369 "data_offset": 0, 00:26:40.369 "data_size": 65536 00:26:40.369 }, 00:26:40.369 { 00:26:40.369 "name": "BaseBdev3", 00:26:40.369 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:40.369 
"is_configured": true, 00:26:40.369 "data_offset": 0, 00:26:40.369 "data_size": 65536 00:26:40.369 }, 00:26:40.369 { 00:26:40.369 "name": "BaseBdev4", 00:26:40.369 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:40.369 "is_configured": true, 00:26:40.369 "data_offset": 0, 00:26:40.369 "data_size": 65536 00:26:40.369 } 00:26:40.369 ] 00:26:40.369 }' 00:26:40.369 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.369 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:40.369 12:07:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=908 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:40.628 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:40.887 "name": 
"raid_bdev1", 00:26:40.887 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:40.887 "strip_size_kb": 0, 00:26:40.887 "state": "online", 00:26:40.887 "raid_level": "raid1", 00:26:40.887 "superblock": false, 00:26:40.887 "num_base_bdevs": 4, 00:26:40.887 "num_base_bdevs_discovered": 3, 00:26:40.887 "num_base_bdevs_operational": 3, 00:26:40.887 "process": { 00:26:40.887 "type": "rebuild", 00:26:40.887 "target": "spare", 00:26:40.887 "progress": { 00:26:40.887 "blocks": 45056, 00:26:40.887 "percent": 68 00:26:40.887 } 00:26:40.887 }, 00:26:40.887 "base_bdevs_list": [ 00:26:40.887 { 00:26:40.887 "name": "spare", 00:26:40.887 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:40.887 "is_configured": true, 00:26:40.887 "data_offset": 0, 00:26:40.887 "data_size": 65536 00:26:40.887 }, 00:26:40.887 { 00:26:40.887 "name": null, 00:26:40.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:40.887 "is_configured": false, 00:26:40.887 "data_offset": 0, 00:26:40.887 "data_size": 65536 00:26:40.887 }, 00:26:40.887 { 00:26:40.887 "name": "BaseBdev3", 00:26:40.887 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:40.887 "is_configured": true, 00:26:40.887 "data_offset": 0, 00:26:40.887 "data_size": 65536 00:26:40.887 }, 00:26:40.887 { 00:26:40.887 "name": "BaseBdev4", 00:26:40.887 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:40.887 "is_configured": true, 00:26:40.887 "data_offset": 0, 00:26:40.887 "data_size": 65536 00:26:40.887 } 00:26:40.887 ] 00:26:40.887 }' 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:40.887 12:07:54 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@710 -- # sleep 1 00:26:41.824 [2024-07-15 12:07:55.224788] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:41.824 [2024-07-15 12:07:55.224851] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:41.824 [2024-07-15 12:07:55.224890] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.824 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.083 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:42.083 "name": "raid_bdev1", 00:26:42.083 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:42.083 "strip_size_kb": 0, 00:26:42.083 "state": "online", 00:26:42.083 "raid_level": "raid1", 00:26:42.083 "superblock": false, 00:26:42.083 "num_base_bdevs": 4, 00:26:42.083 "num_base_bdevs_discovered": 3, 00:26:42.083 "num_base_bdevs_operational": 3, 00:26:42.083 "base_bdevs_list": [ 00:26:42.083 { 00:26:42.083 "name": "spare", 00:26:42.083 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:42.083 
"is_configured": true, 00:26:42.083 "data_offset": 0, 00:26:42.083 "data_size": 65536 00:26:42.083 }, 00:26:42.083 { 00:26:42.083 "name": null, 00:26:42.083 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.083 "is_configured": false, 00:26:42.083 "data_offset": 0, 00:26:42.083 "data_size": 65536 00:26:42.083 }, 00:26:42.083 { 00:26:42.083 "name": "BaseBdev3", 00:26:42.083 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:42.083 "is_configured": true, 00:26:42.083 "data_offset": 0, 00:26:42.083 "data_size": 65536 00:26:42.083 }, 00:26:42.083 { 00:26:42.083 "name": "BaseBdev4", 00:26:42.083 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:42.083 "is_configured": true, 00:26:42.083 "data_offset": 0, 00:26:42.083 "data_size": 65536 00:26:42.083 } 00:26:42.083 ] 00:26:42.083 }' 00:26:42.083 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.083 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:42.083 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.342 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.601 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:42.601 "name": "raid_bdev1", 00:26:42.601 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:42.601 "strip_size_kb": 0, 00:26:42.601 "state": "online", 00:26:42.601 "raid_level": "raid1", 00:26:42.601 "superblock": false, 00:26:42.601 "num_base_bdevs": 4, 00:26:42.601 "num_base_bdevs_discovered": 3, 00:26:42.601 "num_base_bdevs_operational": 3, 00:26:42.601 "base_bdevs_list": [ 00:26:42.601 { 00:26:42.601 "name": "spare", 00:26:42.601 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:42.601 "is_configured": true, 00:26:42.601 "data_offset": 0, 00:26:42.601 "data_size": 65536 00:26:42.601 }, 00:26:42.601 { 00:26:42.601 "name": null, 00:26:42.601 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.601 "is_configured": false, 00:26:42.601 "data_offset": 0, 00:26:42.601 "data_size": 65536 00:26:42.601 }, 00:26:42.601 { 00:26:42.601 "name": "BaseBdev3", 00:26:42.601 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:42.601 "is_configured": true, 00:26:42.601 "data_offset": 0, 00:26:42.601 "data_size": 65536 00:26:42.601 }, 00:26:42.601 { 00:26:42.601 "name": "BaseBdev4", 00:26:42.601 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:42.601 "is_configured": true, 00:26:42.601 "data_offset": 0, 00:26:42.601 "data_size": 65536 00:26:42.601 } 00:26:42.601 ] 00:26:42.601 }' 00:26:42.601 12:07:55 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:42.601 12:07:56 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.601 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:42.864 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:42.864 "name": "raid_bdev1", 00:26:42.864 "uuid": "d3aa7293-e2b5-4c9b-a827-1a9464fa8708", 00:26:42.864 "strip_size_kb": 0, 00:26:42.864 "state": "online", 00:26:42.864 "raid_level": "raid1", 00:26:42.864 "superblock": false, 00:26:42.864 "num_base_bdevs": 4, 00:26:42.864 "num_base_bdevs_discovered": 3, 00:26:42.864 "num_base_bdevs_operational": 3, 00:26:42.864 "base_bdevs_list": [ 00:26:42.864 { 00:26:42.864 "name": 
"spare", 00:26:42.864 "uuid": "34aac5d9-12a1-5aa9-b1af-6c28f933a43d", 00:26:42.864 "is_configured": true, 00:26:42.864 "data_offset": 0, 00:26:42.864 "data_size": 65536 00:26:42.864 }, 00:26:42.864 { 00:26:42.864 "name": null, 00:26:42.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:42.864 "is_configured": false, 00:26:42.864 "data_offset": 0, 00:26:42.864 "data_size": 65536 00:26:42.864 }, 00:26:42.864 { 00:26:42.864 "name": "BaseBdev3", 00:26:42.864 "uuid": "9fb655d2-1d92-5406-905f-64421a5d599f", 00:26:42.864 "is_configured": true, 00:26:42.864 "data_offset": 0, 00:26:42.864 "data_size": 65536 00:26:42.864 }, 00:26:42.864 { 00:26:42.864 "name": "BaseBdev4", 00:26:42.864 "uuid": "b256cb5a-5f71-5f56-87a7-7601f15373a7", 00:26:42.864 "is_configured": true, 00:26:42.864 "data_offset": 0, 00:26:42.864 "data_size": 65536 00:26:42.864 } 00:26:42.864 ] 00:26:42.864 }' 00:26:42.864 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:42.864 12:07:56 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:43.434 12:07:56 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:43.693 [2024-07-15 12:07:57.166144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:43.693 [2024-07-15 12:07:57.166174] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:43.693 [2024-07-15 12:07:57.166232] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:43.693 [2024-07-15 12:07:57.166306] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:43.693 [2024-07-15 12:07:57.166319] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18b7dc0 name raid_bdev1, state offline 00:26:43.693 12:07:57 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.693 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:43.953 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:44.213 /dev/nbd0 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:44.213 1+0 records in 00:26:44.213 1+0 records out 00:26:44.213 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250857 s, 16.3 MB/s 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:44.213 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:44.473 /dev/nbd1 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local i 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # break 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:44.473 1+0 records in 00:26:44.473 1+0 records out 00:26:44.473 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000328364 s, 12.5 MB/s 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # size=4096 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # return 0 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:44.473 12:07:57 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:44.473 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:44.733 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 1580112 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@948 -- # '[' -z 1580112 ']' 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@952 -- # kill -0 1580112 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # uname 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:26:44.993 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1580112 00:26:45.253 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:26:45.253 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:26:45.253 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1580112' 00:26:45.253 killing process with pid 1580112 00:26:45.253 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # kill 1580112 00:26:45.253 Received shutdown signal, test time was about 60.000000 seconds 00:26:45.253 00:26:45.253 Latency(us) 00:26:45.253 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:45.253 =================================================================================================================== 00:26:45.253 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:45.253 [2024-07-15 12:07:58.606143] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:45.253 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@972 -- # wait 1580112 00:26:45.253 [2024-07-15 12:07:58.655746] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:26:45.514 00:26:45.514 real 0m24.623s 00:26:45.514 user 0m32.891s 00:26:45.514 sys 0m5.599s 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # xtrace_disable 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:26:45.514 ************************************ 00:26:45.514 END TEST raid_rebuild_test 00:26:45.514 ************************************ 00:26:45.514 12:07:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:26:45.514 12:07:58 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:26:45.514 12:07:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:26:45.514 12:07:58 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:26:45.514 12:07:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:45.514 ************************************ 00:26:45.514 START TEST raid_rebuild_test_sb 00:26:45.514 ************************************ 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true false true 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=1583502 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 1583502 /var/tmp/spdk-raid.sock 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L 
bdev_raid 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@829 -- # '[' -z 1583502 ']' 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local max_retries=100 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:45.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # xtrace_disable 00:26:45.514 12:07:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:26:45.514 [2024-07-15 12:07:59.048237] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:26:45.514 [2024-07-15 12:07:59.048307] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1583502 ] 00:26:45.514 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:45.514 Zero copy mechanism will not be used. 
00:26:45.775 [2024-07-15 12:07:59.180470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:45.775 [2024-07-15 12:07:59.281793] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:45.775 [2024-07-15 12:07:59.350787] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:45.775 [2024-07-15 12:07:59.350820] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:46.714 12:07:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:26:46.714 12:07:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@862 -- # return 0 00:26:46.714 12:07:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:46.714 12:07:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:26:46.714 BaseBdev1_malloc 00:26:46.714 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:46.973 [2024-07-15 12:08:00.459284] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:46.973 [2024-07-15 12:08:00.459340] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:46.973 [2024-07-15 12:08:00.459363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ad9c0 00:26:46.973 [2024-07-15 12:08:00.459375] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:46.973 [2024-07-15 12:08:00.461013] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:46.973 [2024-07-15 12:08:00.461045] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:46.973 BaseBdev1 
00:26:46.973 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:26:46.973 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:26:47.233 BaseBdev2_malloc
00:26:47.233 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2
00:26:47.493 [2024-07-15 12:08:00.953295] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc
00:26:47.493 [2024-07-15 12:08:00.953343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:26:47.493 [2024-07-15 12:08:00.953364] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ae510
00:26:47.493 [2024-07-15 12:08:00.953377] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:26:47.493 [2024-07-15 12:08:00.954741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:26:47.493 [2024-07-15 12:08:00.954769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:26:47.493 BaseBdev2
00:26:47.493 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:26:47.493 12:08:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc
00:26:47.752 BaseBdev3_malloc
00:26:47.752 12:08:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3
00:26:48.011 [2024-07-15 12:08:01.455162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc
00:26:48.011 [2024-07-15 12:08:01.455206] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:26:48.011 [2024-07-15 12:08:01.455225] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a56310
00:26:48.011 [2024-07-15 12:08:01.455237] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:26:48.011 [2024-07-15 12:08:01.456587] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:26:48.011 [2024-07-15 12:08:01.456614] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3
00:26:48.011 BaseBdev3
00:26:48.011 12:08:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}"
00:26:48.011 12:08:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc
00:26:48.270 BaseBdev4_malloc
00:26:48.270 12:08:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4
00:26:48.530 [2024-07-15 12:08:01.949036] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc
00:26:48.530 [2024-07-15 12:08:01.949088] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:26:48.530 [2024-07-15 12:08:01.949107] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1a567f0
00:26:48.530 [2024-07-15 12:08:01.949119] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:26:48.530 [2024-07-15 12:08:01.950539] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:26:48.530 [2024-07-15 12:08:01.950567] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4
00:26:48.530 BaseBdev4
00:26:48.530 12:08:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc
00:26:48.789 spare_malloc
00:26:48.789 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000
00:26:49.048 spare_delay
00:26:49.048 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare
00:26:49.307 [2024-07-15 12:08:02.691618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay
00:26:49.308 [2024-07-15 12:08:02.691665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:26:49.308 [2024-07-15 12:08:02.691694] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a6700
00:26:49.308 [2024-07-15 12:08:02.691708] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:26:49.308 [2024-07-15 12:08:02.693204] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:26:49.308 [2024-07-15 12:08:02.693235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare
00:26:49.308 spare
00:26:49.308 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1
00:26:49.568 [2024-07-15 12:08:02.944317] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:26:49.568 [2024-07-15 12:08:02.945479] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:26:49.568 [2024-07-15 12:08:02.945530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
00:26:49.568 [2024-07-15 12:08:02.945576] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:26:49.568 [2024-07-15 12:08:02.945770] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18a7dc0
00:26:49.568 [2024-07-15 12:08:02.945782] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:26:49.568 [2024-07-15 12:08:02.945970] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a52b50
00:26:49.568 [2024-07-15 12:08:02.946113] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18a7dc0
00:26:49.568 [2024-07-15 12:08:02.946123] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18a7dc0
00:26:49.568 [2024-07-15 12:08:02.946211] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:49.568 12:08:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:49.830 12:08:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:49.830 "name": "raid_bdev1",
00:26:49.830 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:26:49.830 "strip_size_kb": 0,
00:26:49.830 "state": "online",
00:26:49.830 "raid_level": "raid1",
00:26:49.830 "superblock": true,
00:26:49.830 "num_base_bdevs": 4,
00:26:49.830 "num_base_bdevs_discovered": 4,
00:26:49.830 "num_base_bdevs_operational": 4,
00:26:49.830 "base_bdevs_list": [
00:26:49.830 {
00:26:49.830 "name": "BaseBdev1",
00:26:49.830 "uuid": "11191cfb-929f-5931-87ae-39a5350171fe",
00:26:49.830 "is_configured": true,
00:26:49.830 "data_offset": 2048,
00:26:49.830 "data_size": 63488
00:26:49.830 },
00:26:49.830 {
00:26:49.830 "name": "BaseBdev2",
00:26:49.830 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:26:49.830 "is_configured": true,
00:26:49.830 "data_offset": 2048,
00:26:49.830 "data_size": 63488
00:26:49.830 },
00:26:49.830 {
00:26:49.830 "name": "BaseBdev3",
00:26:49.830 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:26:49.830 "is_configured": true,
00:26:49.830 "data_offset": 2048,
00:26:49.830 "data_size": 63488
00:26:49.830 },
00:26:49.830 {
00:26:49.830 "name": "BaseBdev4",
00:26:49.830 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:26:49.830 "is_configured": true,
00:26:49.830 "data_offset": 2048,
00:26:49.830 "data_size": 63488
00:26:49.830 }
00:26:49.830 ]
00:26:49.830 }'
00:26:49.830 12:08:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:49.830 12:08:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:26:50.400 12:08:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks'
00:26:50.400 12:08:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:26:50.659 [2024-07-15 12:08:04.023459] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:26:50.659 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488
00:26:50.659 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:50.659 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset'
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']'
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']'
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1')
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:50.921 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0
00:26:51.190 [2024-07-15 12:08:04.644853] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18acb20
00:26:51.190 /dev/nbd0
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:26:51.190 1+0 records in
00:26:51.190 1+0 records out
00:26:51.190 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254388 s, 16.1 MB/s
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']'
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1
00:26:51.190 12:08:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct
00:26:59.352 63488+0 records in
00:26:59.352 63488+0 records out
00:26:59.352 32505856 bytes (33 MB, 31 MiB) copied, 7.71833 s, 4.2 MB/s
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:26:59.352 [2024-07-15 12:08:12.690364] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0
00:26:59.352 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1
00:26:59.352 [2024-07-15 12:08:12.935062] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:26:59.612 12:08:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:26:59.612 12:08:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:26:59.612 "name": "raid_bdev1",
00:26:59.612 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:26:59.612 "strip_size_kb": 0,
00:26:59.612 "state": "online",
00:26:59.612 "raid_level": "raid1",
00:26:59.612 "superblock": true,
00:26:59.612 "num_base_bdevs": 4,
00:26:59.612 "num_base_bdevs_discovered": 3,
00:26:59.612 "num_base_bdevs_operational": 3,
00:26:59.612 "base_bdevs_list": [
00:26:59.612 {
00:26:59.612 "name": null,
00:26:59.612 "uuid": "00000000-0000-0000-0000-000000000000",
00:26:59.612 "is_configured": false,
00:26:59.612 "data_offset": 2048,
00:26:59.612 "data_size": 63488
00:26:59.612 },
00:26:59.612 {
00:26:59.612 "name": "BaseBdev2",
00:26:59.612 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:26:59.612 "is_configured": true,
00:26:59.612 "data_offset": 2048,
00:26:59.612 "data_size": 63488
00:26:59.612 },
00:26:59.612 {
00:26:59.612 "name": "BaseBdev3",
00:26:59.612 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:26:59.612 "is_configured": true,
00:26:59.612 "data_offset": 2048,
00:26:59.612 "data_size": 63488
00:26:59.612 },
00:26:59.612 {
00:26:59.612 "name": "BaseBdev4",
00:26:59.612 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:26:59.612 "is_configured": true,
00:26:59.612 "data_offset": 2048,
00:26:59.612 "data_size": 63488
00:26:59.612 }
00:26:59.612 ]
00:26:59.612 }'
00:26:59.612 12:08:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:26:59.612 12:08:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:27:00.191 12:08:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:27:00.450 [2024-07-15 12:08:13.901625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:27:00.450 [2024-07-15 12:08:13.905754] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a52bf0
00:27:00.451 [2024-07-15 12:08:13.908058] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:27:00.451 12:08:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:01.390 12:08:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:01.650 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:01.650 "name": "raid_bdev1",
00:27:01.650 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:27:01.650 "strip_size_kb": 0,
00:27:01.650 "state": "online",
00:27:01.650 "raid_level": "raid1",
00:27:01.650 "superblock": true,
00:27:01.650 "num_base_bdevs": 4,
00:27:01.650 "num_base_bdevs_discovered": 4,
00:27:01.650 "num_base_bdevs_operational": 4,
00:27:01.650 "process": {
00:27:01.650 "type": "rebuild",
00:27:01.650 "target": "spare",
00:27:01.650 "progress": {
00:27:01.650 "blocks": 24576,
00:27:01.650 "percent": 38
00:27:01.650 }
00:27:01.650 },
00:27:01.650 "base_bdevs_list": [
00:27:01.650 {
00:27:01.650 "name": "spare",
00:27:01.650 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68",
00:27:01.650 "is_configured": true,
00:27:01.650 "data_offset": 2048,
00:27:01.650 "data_size": 63488
00:27:01.650 },
00:27:01.650 {
00:27:01.650 "name": "BaseBdev2",
00:27:01.650 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:27:01.650 "is_configured": true,
00:27:01.650 "data_offset": 2048,
00:27:01.650 "data_size": 63488
00:27:01.650 },
00:27:01.650 {
00:27:01.650 "name": "BaseBdev3",
00:27:01.650 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:27:01.650 "is_configured": true,
00:27:01.650 "data_offset": 2048,
00:27:01.650 "data_size": 63488
00:27:01.650 },
00:27:01.650 {
00:27:01.650 "name": "BaseBdev4",
00:27:01.650 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:27:01.650 "is_configured": true,
00:27:01.650 "data_offset": 2048,
00:27:01.650 "data_size": 63488
00:27:01.650 }
00:27:01.650 ]
00:27:01.650 }'
00:27:01.650 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:01.650 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:27:01.650 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:01.910 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:27:01.910 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare
00:27:02.170 [2024-07-15 12:08:15.507239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:27:02.170 [2024-07-15 12:08:15.520735] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:27:02.170 [2024-07-15 12:08:15.520781] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:27:02.170 [2024-07-15 12:08:15.520798] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:27:02.170 [2024-07-15 12:08:15.520806] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:02.170 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:02.429 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:27:02.430 "name": "raid_bdev1",
00:27:02.430 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:27:02.430 "strip_size_kb": 0,
00:27:02.430 "state": "online",
00:27:02.430 "raid_level": "raid1",
00:27:02.430 "superblock": true,
00:27:02.430 "num_base_bdevs": 4,
00:27:02.430 "num_base_bdevs_discovered": 3,
00:27:02.430 "num_base_bdevs_operational": 3,
00:27:02.430 "base_bdevs_list": [
00:27:02.430 {
00:27:02.430 "name": null,
00:27:02.430 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:02.430 "is_configured": false,
00:27:02.430 "data_offset": 2048,
00:27:02.430 "data_size": 63488
00:27:02.430 },
00:27:02.430 {
00:27:02.430 "name": "BaseBdev2",
00:27:02.430 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:27:02.430 "is_configured": true,
00:27:02.430 "data_offset": 2048,
00:27:02.430 "data_size": 63488
00:27:02.430 },
00:27:02.430 {
00:27:02.430 "name": "BaseBdev3",
00:27:02.430 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:27:02.430 "is_configured": true,
00:27:02.430 "data_offset": 2048,
00:27:02.430 "data_size": 63488
00:27:02.430 },
00:27:02.430 {
00:27:02.430 "name": "BaseBdev4",
00:27:02.430 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:27:02.430 "is_configured": true,
00:27:02.430 "data_offset": 2048,
00:27:02.430 "data_size": 63488
00:27:02.430 }
00:27:02.430 ]
00:27:02.430 }'
00:27:02.430 12:08:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:27:02.430 12:08:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x
00:27:02.998 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none
00:27:02.998 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:02.998 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:02.999 "name": "raid_bdev1",
00:27:02.999 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:27:02.999 "strip_size_kb": 0,
00:27:02.999 "state": "online",
00:27:02.999 "raid_level": "raid1",
00:27:02.999 "superblock": true,
00:27:02.999 "num_base_bdevs": 4,
00:27:02.999 "num_base_bdevs_discovered": 3,
00:27:02.999 "num_base_bdevs_operational": 3,
00:27:02.999 "base_bdevs_list": [
00:27:02.999 {
00:27:02.999 "name": null,
00:27:02.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:27:02.999 "is_configured": false,
00:27:02.999 "data_offset": 2048,
00:27:02.999 "data_size": 63488
00:27:02.999 },
00:27:02.999 {
00:27:02.999 "name": "BaseBdev2",
00:27:02.999 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:27:02.999 "is_configured": true,
00:27:02.999 "data_offset": 2048,
00:27:02.999 "data_size": 63488
00:27:02.999 },
00:27:02.999 {
00:27:02.999 "name": "BaseBdev3",
00:27:02.999 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:27:02.999 "is_configured": true,
00:27:02.999 "data_offset": 2048,
00:27:02.999 "data_size": 63488
00:27:02.999 },
00:27:02.999 {
00:27:02.999 "name": "BaseBdev4",
00:27:02.999 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:27:02.999 "is_configured": true,
00:27:02.999 "data_offset": 2048,
00:27:02.999 "data_size": 63488
00:27:02.999 }
00:27:02.999 ]
00:27:02.999 }'
00:27:02.999 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:03.258 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:27:03.258 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:03.258 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:27:03.258 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare
00:27:03.518 [2024-07-15 12:08:16.892979] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed
00:27:03.518 [2024-07-15 12:08:16.897155] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1944b70
00:27:03.518 [2024-07-15 12:08:16.898679] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1
00:27:03.518 12:08:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:27:04.457 12:08:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:27:04.718 "name": "raid_bdev1",
00:27:04.718 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae",
00:27:04.718 "strip_size_kb": 0,
00:27:04.718 "state": "online",
00:27:04.718 "raid_level": "raid1",
00:27:04.718 "superblock": true,
00:27:04.718 "num_base_bdevs": 4,
00:27:04.718 "num_base_bdevs_discovered": 4,
00:27:04.718 "num_base_bdevs_operational": 4,
00:27:04.718 "process": {
00:27:04.718 "type": "rebuild",
00:27:04.718 "target": "spare",
00:27:04.718 "progress": {
00:27:04.718 "blocks": 24576,
00:27:04.718 "percent": 38
00:27:04.718 }
00:27:04.718 },
00:27:04.718 "base_bdevs_list": [
00:27:04.718 {
00:27:04.718 "name": "spare",
00:27:04.718 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68",
00:27:04.718 "is_configured": true,
00:27:04.718 "data_offset": 2048,
00:27:04.718 "data_size": 63488
00:27:04.718 },
00:27:04.718 {
00:27:04.718 "name": "BaseBdev2",
00:27:04.718 "uuid": "97f21c28-5e85-53f3-8928-fe5e70258aef",
00:27:04.718 "is_configured": true,
00:27:04.718 "data_offset": 2048,
00:27:04.718 "data_size": 63488
00:27:04.718 },
00:27:04.718 {
00:27:04.718 "name": "BaseBdev3",
00:27:04.718 "uuid": "2639a265-cede-588d-855f-f39dda18087b",
00:27:04.718 "is_configured": true,
00:27:04.718 "data_offset": 2048,
00:27:04.718 "data_size": 63488
00:27:04.718 },
00:27:04.718 {
00:27:04.718 "name": "BaseBdev4",
00:27:04.718 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2",
00:27:04.718 "is_configured": true,
00:27:04.718 "data_offset": 2048,
00:27:04.718 "data_size": 63488
00:27:04.718 }
00:27:04.718 ]
00:27:04.718 }'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]]
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]]
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']'
00:27:04.718 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']'
00:27:04.718 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
00:27:04.979 [2024-07-15 12:08:18.471227] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:27:05.238 [2024-07-15 12:08:18.611599] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1944b70
00:27:05.238 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]=
00:27:05.238 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- ))
00:27:05.238 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.239 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:05.498 "name": "raid_bdev1", 00:27:05.498 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:05.498 "strip_size_kb": 0, 00:27:05.498 "state": "online", 00:27:05.498 "raid_level": "raid1", 00:27:05.498 "superblock": true, 00:27:05.498 "num_base_bdevs": 4, 00:27:05.498 "num_base_bdevs_discovered": 3, 00:27:05.498 "num_base_bdevs_operational": 3, 00:27:05.498 "process": { 00:27:05.498 "type": "rebuild", 00:27:05.498 "target": "spare", 00:27:05.498 "progress": { 00:27:05.498 "blocks": 36864, 00:27:05.498 "percent": 58 00:27:05.498 } 00:27:05.498 }, 00:27:05.498 "base_bdevs_list": [ 00:27:05.498 { 00:27:05.498 "name": "spare", 00:27:05.498 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:05.498 "is_configured": true, 00:27:05.498 "data_offset": 2048, 00:27:05.498 "data_size": 63488 00:27:05.498 }, 00:27:05.498 { 00:27:05.498 "name": null, 00:27:05.498 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.498 "is_configured": false, 00:27:05.498 "data_offset": 2048, 00:27:05.498 
"data_size": 63488 00:27:05.498 }, 00:27:05.498 { 00:27:05.498 "name": "BaseBdev3", 00:27:05.498 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:05.498 "is_configured": true, 00:27:05.498 "data_offset": 2048, 00:27:05.498 "data_size": 63488 00:27:05.498 }, 00:27:05.498 { 00:27:05.498 "name": "BaseBdev4", 00:27:05.498 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:05.498 "is_configured": true, 00:27:05.498 "data_offset": 2048, 00:27:05.498 "data_size": 63488 00:27:05.498 } 00:27:05.498 ] 00:27:05.498 }' 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=932 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.498 12:08:18 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:05.757 "name": "raid_bdev1", 00:27:05.757 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:05.757 "strip_size_kb": 0, 00:27:05.757 "state": "online", 00:27:05.757 "raid_level": "raid1", 00:27:05.757 "superblock": true, 00:27:05.757 "num_base_bdevs": 4, 00:27:05.757 "num_base_bdevs_discovered": 3, 00:27:05.757 "num_base_bdevs_operational": 3, 00:27:05.757 "process": { 00:27:05.757 "type": "rebuild", 00:27:05.757 "target": "spare", 00:27:05.757 "progress": { 00:27:05.757 "blocks": 43008, 00:27:05.757 "percent": 67 00:27:05.757 } 00:27:05.757 }, 00:27:05.757 "base_bdevs_list": [ 00:27:05.757 { 00:27:05.757 "name": "spare", 00:27:05.757 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:05.757 "is_configured": true, 00:27:05.757 "data_offset": 2048, 00:27:05.757 "data_size": 63488 00:27:05.757 }, 00:27:05.757 { 00:27:05.757 "name": null, 00:27:05.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.757 "is_configured": false, 00:27:05.757 "data_offset": 2048, 00:27:05.757 "data_size": 63488 00:27:05.757 }, 00:27:05.757 { 00:27:05.757 "name": "BaseBdev3", 00:27:05.757 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:05.757 "is_configured": true, 00:27:05.757 "data_offset": 2048, 00:27:05.757 "data_size": 63488 00:27:05.757 }, 00:27:05.757 { 00:27:05.757 "name": "BaseBdev4", 00:27:05.757 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:05.757 "is_configured": true, 00:27:05.757 "data_offset": 2048, 00:27:05.757 "data_size": 63488 00:27:05.757 } 00:27:05.757 ] 00:27:05.757 }' 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:05.757 12:08:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:06.714 [2024-07-15 12:08:20.123464] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:06.714 [2024-07-15 12:08:20.123528] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:06.714 [2024-07-15 12:08:20.123631] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:06.714 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.974 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.974 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.974 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:06.974 "name": "raid_bdev1", 00:27:06.974 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:06.974 "strip_size_kb": 0, 00:27:06.974 "state": "online", 00:27:06.974 "raid_level": "raid1", 00:27:06.974 "superblock": true, 00:27:06.974 "num_base_bdevs": 
4, 00:27:06.974 "num_base_bdevs_discovered": 3, 00:27:06.974 "num_base_bdevs_operational": 3, 00:27:06.974 "base_bdevs_list": [ 00:27:06.974 { 00:27:06.974 "name": "spare", 00:27:06.974 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:06.974 "is_configured": true, 00:27:06.974 "data_offset": 2048, 00:27:06.974 "data_size": 63488 00:27:06.974 }, 00:27:06.974 { 00:27:06.974 "name": null, 00:27:06.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:06.974 "is_configured": false, 00:27:06.974 "data_offset": 2048, 00:27:06.974 "data_size": 63488 00:27:06.974 }, 00:27:06.974 { 00:27:06.974 "name": "BaseBdev3", 00:27:06.974 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:06.974 "is_configured": true, 00:27:06.974 "data_offset": 2048, 00:27:06.974 "data_size": 63488 00:27:06.974 }, 00:27:06.974 { 00:27:06.974 "name": "BaseBdev4", 00:27:06.974 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:06.974 "is_configured": true, 00:27:06.974 "data_offset": 2048, 00:27:06.974 "data_size": 63488 00:27:06.974 } 00:27:06.974 ] 00:27:06.974 }' 00:27:06.974 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:07.234 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:07.234 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:07.234 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:07.234 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:07.235 12:08:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.235 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:07.494 "name": "raid_bdev1", 00:27:07.494 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:07.494 "strip_size_kb": 0, 00:27:07.494 "state": "online", 00:27:07.494 "raid_level": "raid1", 00:27:07.494 "superblock": true, 00:27:07.494 "num_base_bdevs": 4, 00:27:07.494 "num_base_bdevs_discovered": 3, 00:27:07.494 "num_base_bdevs_operational": 3, 00:27:07.494 "base_bdevs_list": [ 00:27:07.494 { 00:27:07.494 "name": "spare", 00:27:07.494 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:07.494 "is_configured": true, 00:27:07.494 "data_offset": 2048, 00:27:07.494 "data_size": 63488 00:27:07.494 }, 00:27:07.494 { 00:27:07.494 "name": null, 00:27:07.494 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.494 "is_configured": false, 00:27:07.494 "data_offset": 2048, 00:27:07.494 "data_size": 63488 00:27:07.494 }, 00:27:07.494 { 00:27:07.494 "name": "BaseBdev3", 00:27:07.494 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:07.494 "is_configured": true, 00:27:07.494 "data_offset": 2048, 00:27:07.494 "data_size": 63488 00:27:07.494 }, 00:27:07.494 { 00:27:07.494 "name": "BaseBdev4", 00:27:07.494 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:07.494 "is_configured": true, 00:27:07.494 "data_offset": 2048, 00:27:07.494 "data_size": 63488 00:27:07.494 } 00:27:07.494 ] 00:27:07.494 }' 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:07.494 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:07.495 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:07.495 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:07.495 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.495 12:08:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:07.753 12:08:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:07.753 "name": "raid_bdev1", 00:27:07.753 "uuid": 
"763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:07.753 "strip_size_kb": 0, 00:27:07.753 "state": "online", 00:27:07.753 "raid_level": "raid1", 00:27:07.753 "superblock": true, 00:27:07.753 "num_base_bdevs": 4, 00:27:07.753 "num_base_bdevs_discovered": 3, 00:27:07.753 "num_base_bdevs_operational": 3, 00:27:07.753 "base_bdevs_list": [ 00:27:07.753 { 00:27:07.753 "name": "spare", 00:27:07.753 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:07.753 "is_configured": true, 00:27:07.753 "data_offset": 2048, 00:27:07.753 "data_size": 63488 00:27:07.753 }, 00:27:07.753 { 00:27:07.753 "name": null, 00:27:07.753 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:07.753 "is_configured": false, 00:27:07.753 "data_offset": 2048, 00:27:07.753 "data_size": 63488 00:27:07.753 }, 00:27:07.753 { 00:27:07.753 "name": "BaseBdev3", 00:27:07.753 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:07.753 "is_configured": true, 00:27:07.754 "data_offset": 2048, 00:27:07.754 "data_size": 63488 00:27:07.754 }, 00:27:07.754 { 00:27:07.754 "name": "BaseBdev4", 00:27:07.754 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:07.754 "is_configured": true, 00:27:07.754 "data_offset": 2048, 00:27:07.754 "data_size": 63488 00:27:07.754 } 00:27:07.754 ] 00:27:07.754 }' 00:27:07.754 12:08:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:07.754 12:08:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:08.321 12:08:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:08.888 [2024-07-15 12:08:22.297320] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:08.888 [2024-07-15 12:08:22.297350] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:08.888 [2024-07-15 12:08:22.297413] bdev_raid.c: 474:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:27:08.888 [2024-07-15 12:08:22.297485] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:08.888 [2024-07-15 12:08:22.297496] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18a7dc0 name raid_bdev1, state offline 00:27:08.888 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.888 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:09.147 12:08:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:09.405 /dev/nbd0 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:09.405 1+0 records in 00:27:09.405 1+0 records out 00:27:09.405 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256533 s, 16.0 MB/s 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:09.405 12:08:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:09.663 /dev/nbd1 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local i 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:09.663 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # break 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:09.664 1+0 records in 00:27:09.664 1+0 records out 00:27:09.664 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000330332 
s, 12.4 MB/s 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # size=4096 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # return 0 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:09.664 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:09.922 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:10.181 12:08:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:10.439 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:10.698 [2024-07-15 12:08:24.228352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:10.698 [2024-07-15 12:08:24.228400] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:10.698 [2024-07-15 12:08:24.228422] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1943dc0 00:27:10.698 [2024-07-15 12:08:24.228434] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:10.698 [2024-07-15 12:08:24.230077] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:10.698 [2024-07-15 12:08:24.230109] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:10.698 [2024-07-15 12:08:24.230189] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:10.698 [2024-07-15 12:08:24.230216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:10.698 [2024-07-15 12:08:24.230323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:10.698 [2024-07-15 12:08:24.230394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:10.698 spare 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.698 12:08:24 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.698 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:10.957 [2024-07-15 12:08:24.330710] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a524b0 00:27:10.957 [2024-07-15 12:08:24.330730] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:10.957 [2024-07-15 12:08:24.330946] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a9dd0 00:27:10.957 [2024-07-15 12:08:24.331110] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a524b0 00:27:10.957 [2024-07-15 12:08:24.331120] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1a524b0 00:27:10.957 [2024-07-15 12:08:24.331230] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:10.957 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.957 "name": "raid_bdev1", 00:27:10.957 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:10.957 "strip_size_kb": 0, 00:27:10.957 "state": "online", 00:27:10.957 "raid_level": "raid1", 
00:27:10.957 "superblock": true, 00:27:10.957 "num_base_bdevs": 4, 00:27:10.957 "num_base_bdevs_discovered": 3, 00:27:10.957 "num_base_bdevs_operational": 3, 00:27:10.957 "base_bdevs_list": [ 00:27:10.957 { 00:27:10.957 "name": "spare", 00:27:10.957 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:10.957 "is_configured": true, 00:27:10.957 "data_offset": 2048, 00:27:10.957 "data_size": 63488 00:27:10.957 }, 00:27:10.957 { 00:27:10.957 "name": null, 00:27:10.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.957 "is_configured": false, 00:27:10.957 "data_offset": 2048, 00:27:10.957 "data_size": 63488 00:27:10.957 }, 00:27:10.957 { 00:27:10.957 "name": "BaseBdev3", 00:27:10.957 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:10.957 "is_configured": true, 00:27:10.957 "data_offset": 2048, 00:27:10.957 "data_size": 63488 00:27:10.957 }, 00:27:10.957 { 00:27:10.957 "name": "BaseBdev4", 00:27:10.957 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:10.957 "is_configured": true, 00:27:10.957 "data_offset": 2048, 00:27:10.957 "data_size": 63488 00:27:10.957 } 00:27:10.957 ] 00:27:10.957 }' 00:27:10.957 12:08:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:10.957 12:08:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.524 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.784 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.784 "name": "raid_bdev1", 00:27:11.784 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:11.784 "strip_size_kb": 0, 00:27:11.784 "state": "online", 00:27:11.784 "raid_level": "raid1", 00:27:11.784 "superblock": true, 00:27:11.784 "num_base_bdevs": 4, 00:27:11.784 "num_base_bdevs_discovered": 3, 00:27:11.784 "num_base_bdevs_operational": 3, 00:27:11.784 "base_bdevs_list": [ 00:27:11.784 { 00:27:11.784 "name": "spare", 00:27:11.784 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:11.784 "is_configured": true, 00:27:11.784 "data_offset": 2048, 00:27:11.784 "data_size": 63488 00:27:11.784 }, 00:27:11.784 { 00:27:11.784 "name": null, 00:27:11.784 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.784 "is_configured": false, 00:27:11.784 "data_offset": 2048, 00:27:11.784 "data_size": 63488 00:27:11.784 }, 00:27:11.784 { 00:27:11.784 "name": "BaseBdev3", 00:27:11.784 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:11.784 "is_configured": true, 00:27:11.784 "data_offset": 2048, 00:27:11.784 "data_size": 63488 00:27:11.784 }, 00:27:11.784 { 00:27:11.784 "name": "BaseBdev4", 00:27:11.784 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:11.784 "is_configured": true, 00:27:11.784 "data_offset": 2048, 00:27:11.784 "data_size": 63488 00:27:11.784 } 00:27:11.784 ] 00:27:11.784 }' 00:27:11.784 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.043 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:12.043 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:27:12.043 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:12.043 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.043 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:12.302 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.302 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:12.562 [2024-07-15 12:08:25.921075] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:12.562 12:08:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.562 12:08:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.822 12:08:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:12.822 "name": "raid_bdev1", 00:27:12.822 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:12.822 "strip_size_kb": 0, 00:27:12.822 "state": "online", 00:27:12.822 "raid_level": "raid1", 00:27:12.822 "superblock": true, 00:27:12.822 "num_base_bdevs": 4, 00:27:12.822 "num_base_bdevs_discovered": 2, 00:27:12.822 "num_base_bdevs_operational": 2, 00:27:12.822 "base_bdevs_list": [ 00:27:12.822 { 00:27:12.822 "name": null, 00:27:12.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.822 "is_configured": false, 00:27:12.822 "data_offset": 2048, 00:27:12.822 "data_size": 63488 00:27:12.822 }, 00:27:12.822 { 00:27:12.822 "name": null, 00:27:12.822 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:12.822 "is_configured": false, 00:27:12.822 "data_offset": 2048, 00:27:12.822 "data_size": 63488 00:27:12.822 }, 00:27:12.822 { 00:27:12.822 "name": "BaseBdev3", 00:27:12.822 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:12.822 "is_configured": true, 00:27:12.822 "data_offset": 2048, 00:27:12.822 "data_size": 63488 00:27:12.822 }, 00:27:12.822 { 00:27:12.822 "name": "BaseBdev4", 00:27:12.822 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:12.822 "is_configured": true, 00:27:12.822 "data_offset": 2048, 00:27:12.822 "data_size": 63488 00:27:12.822 } 00:27:12.822 ] 00:27:12.822 }' 00:27:12.822 12:08:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:12.822 12:08:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:13.390 12:08:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:13.649 [2024-07-15 12:08:27.011978] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:13.650 [2024-07-15 12:08:27.012131] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:13.650 [2024-07-15 12:08:27.012149] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:13.650 [2024-07-15 12:08:27.012178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:13.650 [2024-07-15 12:08:27.016115] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15b5230 00:27:13.650 [2024-07-15 12:08:27.017465] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:13.650 12:08:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.586 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:14.845 "name": "raid_bdev1", 00:27:14.845 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:14.845 "strip_size_kb": 0, 00:27:14.845 "state": "online", 00:27:14.845 "raid_level": "raid1", 00:27:14.845 "superblock": true, 00:27:14.845 "num_base_bdevs": 4, 00:27:14.845 "num_base_bdevs_discovered": 3, 00:27:14.845 "num_base_bdevs_operational": 3, 00:27:14.845 "process": { 00:27:14.845 "type": "rebuild", 00:27:14.845 "target": "spare", 00:27:14.845 "progress": { 00:27:14.845 "blocks": 24576, 00:27:14.845 "percent": 38 00:27:14.845 } 00:27:14.845 }, 00:27:14.845 "base_bdevs_list": [ 00:27:14.845 { 00:27:14.845 "name": "spare", 00:27:14.845 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:14.845 "is_configured": true, 00:27:14.845 "data_offset": 2048, 00:27:14.845 "data_size": 63488 00:27:14.845 }, 00:27:14.845 { 00:27:14.845 "name": null, 00:27:14.845 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:14.845 "is_configured": false, 00:27:14.845 "data_offset": 2048, 00:27:14.845 "data_size": 63488 00:27:14.845 }, 00:27:14.845 { 00:27:14.845 "name": "BaseBdev3", 00:27:14.845 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:14.845 "is_configured": true, 00:27:14.845 "data_offset": 2048, 00:27:14.845 "data_size": 63488 00:27:14.845 }, 00:27:14.845 { 00:27:14.845 "name": "BaseBdev4", 00:27:14.845 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:14.845 "is_configured": true, 00:27:14.845 "data_offset": 2048, 00:27:14.845 "data_size": 63488 00:27:14.845 } 00:27:14.845 ] 00:27:14.845 }' 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:27:14.845 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:15.104 [2024-07-15 12:08:28.617641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.104 [2024-07-15 12:08:28.629611] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:15.104 [2024-07-15 12:08:28.629656] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:15.104 [2024-07-15 12:08:28.629673] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:15.104 [2024-07-15 12:08:28.629681] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.104 12:08:28 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.104 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:15.363 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:15.363 "name": "raid_bdev1", 00:27:15.363 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:15.363 "strip_size_kb": 0, 00:27:15.363 "state": "online", 00:27:15.363 "raid_level": "raid1", 00:27:15.363 "superblock": true, 00:27:15.363 "num_base_bdevs": 4, 00:27:15.363 "num_base_bdevs_discovered": 2, 00:27:15.363 "num_base_bdevs_operational": 2, 00:27:15.363 "base_bdevs_list": [ 00:27:15.363 { 00:27:15.363 "name": null, 00:27:15.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.363 "is_configured": false, 00:27:15.363 "data_offset": 2048, 00:27:15.363 "data_size": 63488 00:27:15.363 }, 00:27:15.363 { 00:27:15.363 "name": null, 00:27:15.363 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:15.363 "is_configured": false, 00:27:15.363 "data_offset": 2048, 00:27:15.363 "data_size": 63488 00:27:15.363 }, 00:27:15.363 { 00:27:15.363 "name": "BaseBdev3", 00:27:15.363 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:15.363 "is_configured": true, 00:27:15.363 "data_offset": 2048, 00:27:15.363 "data_size": 63488 00:27:15.363 }, 00:27:15.363 { 00:27:15.363 "name": "BaseBdev4", 00:27:15.363 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:15.363 "is_configured": true, 00:27:15.363 "data_offset": 2048, 00:27:15.363 "data_size": 63488 00:27:15.363 } 00:27:15.363 ] 00:27:15.363 }' 00:27:15.363 12:08:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:15.363 12:08:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:16.299 12:08:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:16.557 [2024-07-15 12:08:29.921504] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:16.557 [2024-07-15 12:08:29.921557] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:16.557 [2024-07-15 12:08:29.921583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18a9160 00:27:16.558 [2024-07-15 12:08:29.921597] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:16.558 [2024-07-15 12:08:29.921995] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:16.558 [2024-07-15 12:08:29.922016] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:16.558 [2024-07-15 12:08:29.922099] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:16.558 [2024-07-15 12:08:29.922112] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:27:16.558 [2024-07-15 12:08:29.922123] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:16.558 [2024-07-15 12:08:29.922141] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:16.558 [2024-07-15 12:08:29.926111] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a93f0 00:27:16.558 spare 00:27:16.558 [2024-07-15 12:08:29.927466] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:16.558 12:08:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.495 12:08:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:17.755 "name": "raid_bdev1", 00:27:17.755 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:17.755 "strip_size_kb": 0, 00:27:17.755 "state": "online", 00:27:17.755 "raid_level": "raid1", 00:27:17.755 "superblock": true, 00:27:17.755 "num_base_bdevs": 4, 00:27:17.755 "num_base_bdevs_discovered": 3, 00:27:17.755 "num_base_bdevs_operational": 3, 00:27:17.755 "process": { 00:27:17.755 "type": "rebuild", 00:27:17.755 "target": "spare", 00:27:17.755 "progress": { 00:27:17.755 "blocks": 24576, 00:27:17.755 
"percent": 38 00:27:17.755 } 00:27:17.755 }, 00:27:17.755 "base_bdevs_list": [ 00:27:17.755 { 00:27:17.755 "name": "spare", 00:27:17.755 "uuid": "35093a4f-46e5-54e5-b158-b19cedb96a68", 00:27:17.755 "is_configured": true, 00:27:17.755 "data_offset": 2048, 00:27:17.755 "data_size": 63488 00:27:17.755 }, 00:27:17.755 { 00:27:17.755 "name": null, 00:27:17.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:17.755 "is_configured": false, 00:27:17.755 "data_offset": 2048, 00:27:17.755 "data_size": 63488 00:27:17.755 }, 00:27:17.755 { 00:27:17.755 "name": "BaseBdev3", 00:27:17.755 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:17.755 "is_configured": true, 00:27:17.755 "data_offset": 2048, 00:27:17.755 "data_size": 63488 00:27:17.755 }, 00:27:17.755 { 00:27:17.755 "name": "BaseBdev4", 00:27:17.755 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:17.755 "is_configured": true, 00:27:17.755 "data_offset": 2048, 00:27:17.755 "data_size": 63488 00:27:17.755 } 00:27:17.755 ] 00:27:17.755 }' 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:17.755 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:18.037 [2024-07-15 12:08:31.515653] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:18.037 [2024-07-15 12:08:31.540095] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:18.037 [2024-07-15 12:08:31.540136] bdev_raid.c: 
331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:18.037 [2024-07-15 12:08:31.540152] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:18.037 [2024-07-15 12:08:31.540161] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.037 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.296 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.296 "name": "raid_bdev1", 00:27:18.296 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:18.296 "strip_size_kb": 0, 00:27:18.296 "state": 
"online", 00:27:18.296 "raid_level": "raid1", 00:27:18.296 "superblock": true, 00:27:18.296 "num_base_bdevs": 4, 00:27:18.296 "num_base_bdevs_discovered": 2, 00:27:18.296 "num_base_bdevs_operational": 2, 00:27:18.296 "base_bdevs_list": [ 00:27:18.296 { 00:27:18.296 "name": null, 00:27:18.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.296 "is_configured": false, 00:27:18.296 "data_offset": 2048, 00:27:18.296 "data_size": 63488 00:27:18.296 }, 00:27:18.296 { 00:27:18.296 "name": null, 00:27:18.296 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.296 "is_configured": false, 00:27:18.296 "data_offset": 2048, 00:27:18.296 "data_size": 63488 00:27:18.296 }, 00:27:18.296 { 00:27:18.296 "name": "BaseBdev3", 00:27:18.296 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:18.296 "is_configured": true, 00:27:18.296 "data_offset": 2048, 00:27:18.296 "data_size": 63488 00:27:18.296 }, 00:27:18.296 { 00:27:18.296 "name": "BaseBdev4", 00:27:18.296 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:18.296 "is_configured": true, 00:27:18.296 "data_offset": 2048, 00:27:18.296 "data_size": 63488 00:27:18.296 } 00:27:18.296 ] 00:27:18.296 }' 00:27:18.296 12:08:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.296 12:08:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.963 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:19.222 "name": "raid_bdev1", 00:27:19.222 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:19.222 "strip_size_kb": 0, 00:27:19.222 "state": "online", 00:27:19.222 "raid_level": "raid1", 00:27:19.222 "superblock": true, 00:27:19.222 "num_base_bdevs": 4, 00:27:19.222 "num_base_bdevs_discovered": 2, 00:27:19.222 "num_base_bdevs_operational": 2, 00:27:19.222 "base_bdevs_list": [ 00:27:19.222 { 00:27:19.222 "name": null, 00:27:19.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.222 "is_configured": false, 00:27:19.222 "data_offset": 2048, 00:27:19.222 "data_size": 63488 00:27:19.222 }, 00:27:19.222 { 00:27:19.222 "name": null, 00:27:19.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:19.222 "is_configured": false, 00:27:19.222 "data_offset": 2048, 00:27:19.222 "data_size": 63488 00:27:19.222 }, 00:27:19.222 { 00:27:19.222 "name": "BaseBdev3", 00:27:19.222 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:19.222 "is_configured": true, 00:27:19.222 "data_offset": 2048, 00:27:19.222 "data_size": 63488 00:27:19.222 }, 00:27:19.222 { 00:27:19.222 "name": "BaseBdev4", 00:27:19.222 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:19.222 "is_configured": true, 00:27:19.222 "data_offset": 2048, 00:27:19.222 "data_size": 63488 00:27:19.222 } 00:27:19.222 ] 00:27:19.222 }' 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:19.222 12:08:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:19.481 12:08:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:19.740 [2024-07-15 12:08:33.268559] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:19.740 [2024-07-15 12:08:33.268611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.740 [2024-07-15 12:08:33.268632] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18adbf0 00:27:19.740 [2024-07-15 12:08:33.268644] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.740 [2024-07-15 12:08:33.269020] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.740 [2024-07-15 12:08:33.269041] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:19.740 [2024-07-15 12:08:33.269111] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:19.740 [2024-07-15 12:08:33.269130] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:19.740 [2024-07-15 12:08:33.269141] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:19.740 BaseBdev1 00:27:19.740 12:08:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:21.116 
12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.116 "name": "raid_bdev1", 00:27:21.116 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:21.116 "strip_size_kb": 0, 00:27:21.116 "state": "online", 00:27:21.116 "raid_level": "raid1", 00:27:21.116 "superblock": true, 00:27:21.116 "num_base_bdevs": 4, 00:27:21.116 "num_base_bdevs_discovered": 2, 00:27:21.116 "num_base_bdevs_operational": 2, 00:27:21.116 "base_bdevs_list": [ 00:27:21.116 { 00:27:21.116 "name": null, 00:27:21.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.116 "is_configured": false, 00:27:21.116 "data_offset": 2048, 00:27:21.116 "data_size": 63488 00:27:21.116 }, 
00:27:21.116 { 00:27:21.116 "name": null, 00:27:21.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.116 "is_configured": false, 00:27:21.116 "data_offset": 2048, 00:27:21.116 "data_size": 63488 00:27:21.116 }, 00:27:21.116 { 00:27:21.116 "name": "BaseBdev3", 00:27:21.116 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:21.116 "is_configured": true, 00:27:21.116 "data_offset": 2048, 00:27:21.116 "data_size": 63488 00:27:21.116 }, 00:27:21.116 { 00:27:21.116 "name": "BaseBdev4", 00:27:21.116 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:21.116 "is_configured": true, 00:27:21.116 "data_offset": 2048, 00:27:21.116 "data_size": 63488 00:27:21.116 } 00:27:21.116 ] 00:27:21.116 }' 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.116 12:08:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.682 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:21.941 "name": "raid_bdev1", 00:27:21.941 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:21.941 
"strip_size_kb": 0, 00:27:21.941 "state": "online", 00:27:21.941 "raid_level": "raid1", 00:27:21.941 "superblock": true, 00:27:21.941 "num_base_bdevs": 4, 00:27:21.941 "num_base_bdevs_discovered": 2, 00:27:21.941 "num_base_bdevs_operational": 2, 00:27:21.941 "base_bdevs_list": [ 00:27:21.941 { 00:27:21.941 "name": null, 00:27:21.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.941 "is_configured": false, 00:27:21.941 "data_offset": 2048, 00:27:21.941 "data_size": 63488 00:27:21.941 }, 00:27:21.941 { 00:27:21.941 "name": null, 00:27:21.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.941 "is_configured": false, 00:27:21.941 "data_offset": 2048, 00:27:21.941 "data_size": 63488 00:27:21.941 }, 00:27:21.941 { 00:27:21.941 "name": "BaseBdev3", 00:27:21.941 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:21.941 "is_configured": true, 00:27:21.941 "data_offset": 2048, 00:27:21.941 "data_size": 63488 00:27:21.941 }, 00:27:21.941 { 00:27:21.941 "name": "BaseBdev4", 00:27:21.941 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:21.941 "is_configured": true, 00:27:21.941 "data_offset": 2048, 00:27:21.941 "data_size": 63488 00:27:21.941 } 00:27:21.941 ] 00:27:21.941 }' 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@648 -- # local es=0 00:27:21.941 12:08:35 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:21.941 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:22.200 [2024-07-15 12:08:35.783373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:22.200 [2024-07-15 12:08:35.783514] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:27:22.200 [2024-07-15 12:08:35.783531] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:22.200 request: 00:27:22.200 { 00:27:22.200 "base_bdev": "BaseBdev1", 00:27:22.200 "raid_bdev": "raid_bdev1", 00:27:22.200 "method": "bdev_raid_add_base_bdev", 00:27:22.200 "req_id": 1 00:27:22.200 } 00:27:22.200 Got JSON-RPC error response 00:27:22.200 response: 00:27:22.200 { 00:27:22.200 "code": -22, 00:27:22.200 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:22.200 } 00:27:22.458 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # es=1 00:27:22.458 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:27:22.458 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:27:22.458 12:08:35 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:27:22.458 12:08:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.393 12:08:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.652 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.652 "name": "raid_bdev1", 00:27:23.652 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:23.652 "strip_size_kb": 0, 00:27:23.652 "state": "online", 00:27:23.652 "raid_level": "raid1", 00:27:23.652 "superblock": true, 00:27:23.652 "num_base_bdevs": 4, 00:27:23.652 "num_base_bdevs_discovered": 2, 00:27:23.652 "num_base_bdevs_operational": 2, 00:27:23.652 "base_bdevs_list": [ 00:27:23.652 { 00:27:23.652 "name": null, 00:27:23.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.652 "is_configured": false, 00:27:23.652 "data_offset": 2048, 00:27:23.652 "data_size": 63488 00:27:23.652 }, 00:27:23.652 { 00:27:23.652 "name": null, 00:27:23.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.652 "is_configured": false, 00:27:23.652 "data_offset": 2048, 00:27:23.652 "data_size": 63488 00:27:23.652 }, 00:27:23.652 { 00:27:23.652 "name": "BaseBdev3", 00:27:23.652 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 00:27:23.652 "is_configured": true, 00:27:23.652 "data_offset": 2048, 00:27:23.652 "data_size": 63488 00:27:23.652 }, 00:27:23.652 { 00:27:23.652 "name": "BaseBdev4", 00:27:23.652 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:23.652 "is_configured": true, 00:27:23.652 "data_offset": 2048, 00:27:23.652 "data_size": 63488 00:27:23.652 } 00:27:23.652 ] 00:27:23.652 }' 00:27:23.652 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.652 12:08:37 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.219 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.478 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:24.478 "name": "raid_bdev1", 00:27:24.478 "uuid": "763734d3-9a8f-47f3-b7e1-7453d532dfae", 00:27:24.478 "strip_size_kb": 0, 00:27:24.478 "state": "online", 00:27:24.478 "raid_level": "raid1", 00:27:24.478 "superblock": true, 00:27:24.478 "num_base_bdevs": 4, 00:27:24.478 "num_base_bdevs_discovered": 2, 00:27:24.478 "num_base_bdevs_operational": 2, 00:27:24.478 "base_bdevs_list": [ 00:27:24.478 { 00:27:24.478 "name": null, 00:27:24.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.478 "is_configured": false, 00:27:24.478 "data_offset": 2048, 00:27:24.478 "data_size": 63488 00:27:24.478 }, 00:27:24.478 { 00:27:24.478 "name": null, 00:27:24.478 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.478 "is_configured": false, 00:27:24.478 "data_offset": 2048, 00:27:24.478 "data_size": 63488 00:27:24.478 }, 00:27:24.478 { 00:27:24.478 "name": "BaseBdev3", 00:27:24.478 "uuid": "2639a265-cede-588d-855f-f39dda18087b", 
00:27:24.478 "is_configured": true, 00:27:24.478 "data_offset": 2048, 00:27:24.478 "data_size": 63488 00:27:24.478 }, 00:27:24.478 { 00:27:24.478 "name": "BaseBdev4", 00:27:24.478 "uuid": "64046208-8bdf-5962-b77e-4ca70654b6f2", 00:27:24.478 "is_configured": true, 00:27:24.478 "data_offset": 2048, 00:27:24.478 "data_size": 63488 00:27:24.478 } 00:27:24.478 ] 00:27:24.478 }' 00:27:24.478 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:24.478 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:24.478 12:08:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 1583502 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@948 -- # '[' -z 1583502 ']' 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@952 -- # kill -0 1583502 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # uname 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1583502 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1583502' 00:27:24.478 killing process with pid 1583502 00:27:24.478 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # kill 1583502 00:27:24.478 
Received shutdown signal, test time was about 60.000000 seconds 00:27:24.478 00:27:24.478 Latency(us) 00:27:24.479 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:24.479 =================================================================================================================== 00:27:24.479 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:24.479 [2024-07-15 12:08:38.054175] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:24.479 [2024-07-15 12:08:38.054276] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:24.479 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@972 -- # wait 1583502 00:27:24.479 [2024-07-15 12:08:38.054337] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:24.479 [2024-07-15 12:08:38.054350] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a524b0 name raid_bdev1, state offline 00:27:24.737 [2024-07-15 12:08:38.108211] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:24.737 12:08:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:27:24.737 00:27:24.737 real 0m39.356s 00:27:24.737 user 0m56.824s 00:27:24.737 sys 0m7.412s 00:27:24.737 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:24.737 12:08:38 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:27:24.737 ************************************ 00:27:24.737 END TEST raid_rebuild_test_sb 00:27:24.737 ************************************ 00:27:24.996 12:08:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:24.996 12:08:38 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:27:24.996 12:08:38 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:24.996 12:08:38 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:27:24.996 12:08:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:24.996 ************************************ 00:27:24.996 START TEST raid_rebuild_test_io 00:27:24.996 ************************************ 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 false true true 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:24.996 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1588978 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1588978 /var/tmp/spdk-raid.sock 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@829 -- # '[' -z 
1588978 ']' 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:24.997 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:24.997 12:08:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:24.997 [2024-07-15 12:08:38.485700] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:27:24.997 [2024-07-15 12:08:38.485766] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1588978 ] 00:27:24.997 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:24.997 Zero copy mechanism will not be used. 
00:27:25.255 [2024-07-15 12:08:38.614936] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:25.255 [2024-07-15 12:08:38.716614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:25.255 [2024-07-15 12:08:38.779704] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:25.255 [2024-07-15 12:08:38.779748] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:25.823 12:08:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:25.823 12:08:39 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@862 -- # return 0 00:27:25.823 12:08:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:25.823 12:08:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:26.081 BaseBdev1_malloc 00:27:26.081 12:08:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:26.340 [2024-07-15 12:08:39.888505] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:26.340 [2024-07-15 12:08:39.888555] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.340 [2024-07-15 12:08:39.888579] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e169c0 00:27:26.340 [2024-07-15 12:08:39.888592] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.340 [2024-07-15 12:08:39.890338] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.340 [2024-07-15 12:08:39.890369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:26.340 BaseBdev1 
00:27:26.340 12:08:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:26.340 12:08:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:26.599 BaseBdev2_malloc 00:27:26.599 12:08:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:27.166 [2024-07-15 12:08:40.640406] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:27.166 [2024-07-15 12:08:40.640455] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.167 [2024-07-15 12:08:40.640478] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e17510 00:27:27.167 [2024-07-15 12:08:40.640491] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.167 [2024-07-15 12:08:40.642041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.167 [2024-07-15 12:08:40.642072] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:27.167 BaseBdev2 00:27:27.167 12:08:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:27.167 12:08:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:27.424 BaseBdev3_malloc 00:27:27.425 12:08:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:27.682 [2024-07-15 12:08:41.138298] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:27.682 [2024-07-15 12:08:41.138342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:27.682 [2024-07-15 12:08:41.138362] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbf310 00:27:27.682 [2024-07-15 12:08:41.138374] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:27.682 [2024-07-15 12:08:41.139883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:27.682 [2024-07-15 12:08:41.139918] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:27.682 BaseBdev3 00:27:27.682 12:08:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:27.682 12:08:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:27.940 BaseBdev4_malloc 00:27:27.940 12:08:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:28.198 [2024-07-15 12:08:41.632157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:28.198 [2024-07-15 12:08:41.632206] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:28.198 [2024-07-15 12:08:41.632226] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fbf7f0 00:27:28.198 [2024-07-15 12:08:41.632238] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:28.198 [2024-07-15 12:08:41.633806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:28.198 [2024-07-15 12:08:41.633835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:27:28.198 BaseBdev4 00:27:28.198 12:08:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:28.457 spare_malloc 00:27:28.457 12:08:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:28.716 spare_delay 00:27:28.716 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:28.975 [2024-07-15 12:08:42.367844] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:28.975 [2024-07-15 12:08:42.367888] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:28.975 [2024-07-15 12:08:42.367912] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e0f700 00:27:28.975 [2024-07-15 12:08:42.367924] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:28.975 [2024-07-15 12:08:42.369519] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:28.975 [2024-07-15 12:08:42.369548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:28.975 spare 00:27:28.975 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:29.233 [2024-07-15 12:08:42.612513] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:29.233 [2024-07-15 12:08:42.613898] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:27:29.233 [2024-07-15 12:08:42.613953] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:29.233 [2024-07-15 12:08:42.613999] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:29.233 [2024-07-15 12:08:42.614081] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e10dc0 00:27:29.233 [2024-07-15 12:08:42.614092] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:27:29.233 [2024-07-15 12:08:42.614310] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e14f70 00:27:29.233 [2024-07-15 12:08:42.614463] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e10dc0 00:27:29.233 [2024-07-15 12:08:42.614473] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1e10dc0 00:27:29.233 [2024-07-15 12:08:42.614597] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.233 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.491 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.491 "name": "raid_bdev1", 00:27:29.491 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:29.491 "strip_size_kb": 0, 00:27:29.491 "state": "online", 00:27:29.491 "raid_level": "raid1", 00:27:29.491 "superblock": false, 00:27:29.491 "num_base_bdevs": 4, 00:27:29.491 "num_base_bdevs_discovered": 4, 00:27:29.491 "num_base_bdevs_operational": 4, 00:27:29.491 "base_bdevs_list": [ 00:27:29.491 { 00:27:29.491 "name": "BaseBdev1", 00:27:29.491 "uuid": "bcd08eff-f40b-50b5-9510-75219de05489", 00:27:29.491 "is_configured": true, 00:27:29.491 "data_offset": 0, 00:27:29.491 "data_size": 65536 00:27:29.491 }, 00:27:29.491 { 00:27:29.491 "name": "BaseBdev2", 00:27:29.491 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:29.491 "is_configured": true, 00:27:29.491 "data_offset": 0, 00:27:29.491 "data_size": 65536 00:27:29.491 }, 00:27:29.491 { 00:27:29.491 "name": "BaseBdev3", 00:27:29.491 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:29.491 "is_configured": true, 00:27:29.491 "data_offset": 0, 00:27:29.491 "data_size": 65536 00:27:29.491 }, 00:27:29.491 { 00:27:29.491 "name": "BaseBdev4", 00:27:29.491 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:29.492 "is_configured": true, 00:27:29.492 "data_offset": 0, 00:27:29.492 "data_size": 65536 00:27:29.492 } 00:27:29.492 ] 00:27:29.492 }' 00:27:29.492 12:08:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:27:29.492 12:08:42 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:30.057 12:08:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:30.057 12:08:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:30.315 [2024-07-15 12:08:43.820192] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:30.315 12:08:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:27:30.315 12:08:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.315 12:08:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:30.574 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:27:30.574 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:27:30.574 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:30.574 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:30.833 [2024-07-15 12:08:44.203242] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e15600 00:27:30.833 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:30.833 Zero copy mechanism will not be used. 00:27:30.833 Running I/O for 60 seconds... 
00:27:30.833 [2024-07-15 12:08:44.372169] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:30.833 [2024-07-15 12:08:44.388327] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1e15600 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.833 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.401 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.401 "name": "raid_bdev1", 00:27:31.401 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:31.401 "strip_size_kb": 0, 00:27:31.401 "state": "online", 00:27:31.401 "raid_level": "raid1", 00:27:31.401 "superblock": false, 
00:27:31.401 "num_base_bdevs": 4, 00:27:31.401 "num_base_bdevs_discovered": 3, 00:27:31.401 "num_base_bdevs_operational": 3, 00:27:31.401 "base_bdevs_list": [ 00:27:31.401 { 00:27:31.401 "name": null, 00:27:31.401 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.401 "is_configured": false, 00:27:31.401 "data_offset": 0, 00:27:31.401 "data_size": 65536 00:27:31.401 }, 00:27:31.401 { 00:27:31.401 "name": "BaseBdev2", 00:27:31.401 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:31.401 "is_configured": true, 00:27:31.401 "data_offset": 0, 00:27:31.401 "data_size": 65536 00:27:31.401 }, 00:27:31.401 { 00:27:31.401 "name": "BaseBdev3", 00:27:31.401 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:31.401 "is_configured": true, 00:27:31.401 "data_offset": 0, 00:27:31.401 "data_size": 65536 00:27:31.401 }, 00:27:31.401 { 00:27:31.401 "name": "BaseBdev4", 00:27:31.401 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:31.401 "is_configured": true, 00:27:31.401 "data_offset": 0, 00:27:31.401 "data_size": 65536 00:27:31.401 } 00:27:31.401 ] 00:27:31.401 }' 00:27:31.401 12:08:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.401 12:08:44 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:31.967 12:08:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:32.226 [2024-07-15 12:08:45.709411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:32.226 12:08:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:32.226 [2024-07-15 12:08:45.801947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e13c70 00:27:32.226 [2024-07-15 12:08:45.804324] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:32.486 [2024-07-15 
12:08:45.916164] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:32.486 [2024-07-15 12:08:45.916506] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:32.486 [2024-07-15 12:08:46.019726] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:32.486 [2024-07-15 12:08:46.019994] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:33.054 [2024-07-15 12:08:46.357832] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:33.054 [2024-07-15 12:08:46.581997] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:33.054 [2024-07-15 12:08:46.582655] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.312 12:08:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:33.571 
[2024-07-15 12:08:46.947527] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:33.571 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:33.571 "name": "raid_bdev1", 00:27:33.571 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:33.571 "strip_size_kb": 0, 00:27:33.571 "state": "online", 00:27:33.571 "raid_level": "raid1", 00:27:33.571 "superblock": false, 00:27:33.571 "num_base_bdevs": 4, 00:27:33.571 "num_base_bdevs_discovered": 4, 00:27:33.571 "num_base_bdevs_operational": 4, 00:27:33.571 "process": { 00:27:33.571 "type": "rebuild", 00:27:33.571 "target": "spare", 00:27:33.571 "progress": { 00:27:33.571 "blocks": 14336, 00:27:33.571 "percent": 21 00:27:33.571 } 00:27:33.571 }, 00:27:33.571 "base_bdevs_list": [ 00:27:33.571 { 00:27:33.571 "name": "spare", 00:27:33.571 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:33.571 "is_configured": true, 00:27:33.571 "data_offset": 0, 00:27:33.571 "data_size": 65536 00:27:33.571 }, 00:27:33.571 { 00:27:33.571 "name": "BaseBdev2", 00:27:33.571 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:33.571 "is_configured": true, 00:27:33.571 "data_offset": 0, 00:27:33.571 "data_size": 65536 00:27:33.571 }, 00:27:33.571 { 00:27:33.571 "name": "BaseBdev3", 00:27:33.571 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:33.571 "is_configured": true, 00:27:33.571 "data_offset": 0, 00:27:33.571 "data_size": 65536 00:27:33.571 }, 00:27:33.571 { 00:27:33.571 "name": "BaseBdev4", 00:27:33.571 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:33.571 "is_configured": true, 00:27:33.571 "data_offset": 0, 00:27:33.571 "data_size": 65536 00:27:33.571 } 00:27:33.572 ] 00:27:33.572 }' 00:27:33.572 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:33.572 [2024-07-15 12:08:47.069056] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:33.572 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:33.572 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:33.572 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:33.572 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:33.830 [2024-07-15 12:08:47.350633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:34.089 [2024-07-15 12:08:47.550835] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:34.089 [2024-07-15 12:08:47.570699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:34.089 [2024-07-15 12:08:47.570745] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:34.089 [2024-07-15 12:08:47.570756] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:34.089 [2024-07-15 12:08:47.576379] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1e15600 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.089 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:34.348 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.348 "name": "raid_bdev1", 00:27:34.348 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:34.348 "strip_size_kb": 0, 00:27:34.348 "state": "online", 00:27:34.348 "raid_level": "raid1", 00:27:34.348 "superblock": false, 00:27:34.348 "num_base_bdevs": 4, 00:27:34.348 "num_base_bdevs_discovered": 3, 00:27:34.348 "num_base_bdevs_operational": 3, 00:27:34.348 "base_bdevs_list": [ 00:27:34.348 { 00:27:34.348 "name": null, 00:27:34.348 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:34.348 "is_configured": false, 00:27:34.348 "data_offset": 0, 00:27:34.348 "data_size": 65536 00:27:34.348 }, 00:27:34.348 { 00:27:34.348 "name": "BaseBdev2", 00:27:34.348 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:34.348 "is_configured": true, 00:27:34.348 "data_offset": 0, 00:27:34.348 "data_size": 65536 00:27:34.348 }, 00:27:34.348 { 00:27:34.348 "name": "BaseBdev3", 00:27:34.348 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:34.349 "is_configured": true, 00:27:34.349 "data_offset": 0, 00:27:34.349 "data_size": 65536 00:27:34.349 }, 00:27:34.349 { 00:27:34.349 "name": 
"BaseBdev4", 00:27:34.349 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:34.349 "is_configured": true, 00:27:34.349 "data_offset": 0, 00:27:34.349 "data_size": 65536 00:27:34.349 } 00:27:34.349 ] 00:27:34.349 }' 00:27:34.349 12:08:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.349 12:08:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:35.289 "name": "raid_bdev1", 00:27:35.289 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:35.289 "strip_size_kb": 0, 00:27:35.289 "state": "online", 00:27:35.289 "raid_level": "raid1", 00:27:35.289 "superblock": false, 00:27:35.289 "num_base_bdevs": 4, 00:27:35.289 "num_base_bdevs_discovered": 3, 00:27:35.289 "num_base_bdevs_operational": 3, 00:27:35.289 "base_bdevs_list": [ 00:27:35.289 { 00:27:35.289 "name": null, 00:27:35.289 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:35.289 "is_configured": false, 00:27:35.289 "data_offset": 0, 00:27:35.289 "data_size": 65536 
00:27:35.289 }, 00:27:35.289 { 00:27:35.289 "name": "BaseBdev2", 00:27:35.289 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:35.289 "is_configured": true, 00:27:35.289 "data_offset": 0, 00:27:35.289 "data_size": 65536 00:27:35.289 }, 00:27:35.289 { 00:27:35.289 "name": "BaseBdev3", 00:27:35.289 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:35.289 "is_configured": true, 00:27:35.289 "data_offset": 0, 00:27:35.289 "data_size": 65536 00:27:35.289 }, 00:27:35.289 { 00:27:35.289 "name": "BaseBdev4", 00:27:35.289 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:35.289 "is_configured": true, 00:27:35.289 "data_offset": 0, 00:27:35.289 "data_size": 65536 00:27:35.289 } 00:27:35.289 ] 00:27:35.289 }' 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:35.289 12:08:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:35.548 [2024-07-15 12:08:49.059026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:35.548 [2024-07-15 12:08:49.096088] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e41570 00:27:35.548 [2024-07-15 12:08:49.097596] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:35.548 12:08:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:35.806 [2024-07-15 12:08:49.219091] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:27:35.806 [2024-07-15 12:08:49.219539] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:35.806 [2024-07-15 12:08:49.330311] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:35.806 [2024-07-15 12:08:49.330882] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:36.371 [2024-07-15 12:08:49.684050] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:36.371 [2024-07-15 12:08:49.684375] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:36.371 [2024-07-15 12:08:49.826860] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:36.630 [2024-07-15 12:08:50.106802] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:36.630 [2024-07-15 12:08:50.108088] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.630 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:36.889 [2024-07-15 12:08:50.318960] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:36.889 [2024-07-15 12:08:50.319629] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:37.148 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:37.148 "name": "raid_bdev1", 00:27:37.148 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:37.148 "strip_size_kb": 0, 00:27:37.148 "state": "online", 00:27:37.148 "raid_level": "raid1", 00:27:37.148 "superblock": false, 00:27:37.148 "num_base_bdevs": 4, 00:27:37.148 "num_base_bdevs_discovered": 4, 00:27:37.148 "num_base_bdevs_operational": 4, 00:27:37.148 "process": { 00:27:37.148 "type": "rebuild", 00:27:37.148 "target": "spare", 00:27:37.148 "progress": { 00:27:37.148 "blocks": 18432, 00:27:37.148 "percent": 28 00:27:37.148 } 00:27:37.148 }, 00:27:37.148 "base_bdevs_list": [ 00:27:37.148 { 00:27:37.148 "name": "spare", 00:27:37.148 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:37.148 "is_configured": true, 00:27:37.148 "data_offset": 0, 00:27:37.148 "data_size": 65536 00:27:37.148 }, 00:27:37.148 { 00:27:37.148 "name": "BaseBdev2", 00:27:37.148 "uuid": "77ed0944-6691-5b35-a8d2-68dea41bd95a", 00:27:37.148 "is_configured": true, 00:27:37.148 "data_offset": 0, 00:27:37.148 "data_size": 65536 00:27:37.148 }, 00:27:37.148 { 00:27:37.148 "name": "BaseBdev3", 00:27:37.148 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:37.148 "is_configured": true, 00:27:37.148 "data_offset": 0, 00:27:37.148 "data_size": 65536 00:27:37.148 }, 00:27:37.148 { 00:27:37.148 "name": "BaseBdev4", 00:27:37.148 
"uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:37.148 "is_configured": true, 00:27:37.148 "data_offset": 0, 00:27:37.148 "data_size": 65536 00:27:37.148 } 00:27:37.148 ] 00:27:37.148 }' 00:27:37.148 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:37.148 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:37.148 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:37.407 12:08:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:37.666 [2024-07-15 12:08:51.009813] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:37.666 [2024-07-15 12:08:51.063007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:37.666 [2024-07-15 12:08:51.260029] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:37.666 [2024-07-15 12:08:51.260968] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1e15600 00:27:37.666 [2024-07-15 12:08:51.260992] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1e41570 00:27:37.925 
12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.925 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:38.493 "name": "raid_bdev1", 00:27:38.493 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:38.493 "strip_size_kb": 0, 00:27:38.493 "state": "online", 00:27:38.493 "raid_level": "raid1", 00:27:38.493 "superblock": false, 00:27:38.493 "num_base_bdevs": 4, 00:27:38.493 "num_base_bdevs_discovered": 3, 00:27:38.493 "num_base_bdevs_operational": 3, 00:27:38.493 "process": { 00:27:38.493 "type": "rebuild", 00:27:38.493 "target": "spare", 00:27:38.493 "progress": { 00:27:38.493 "blocks": 36864, 00:27:38.493 "percent": 56 00:27:38.493 } 00:27:38.493 }, 00:27:38.493 "base_bdevs_list": [ 00:27:38.493 { 00:27:38.493 "name": "spare", 00:27:38.493 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:38.493 "is_configured": true, 00:27:38.493 "data_offset": 0, 00:27:38.493 "data_size": 65536 
00:27:38.493 }, 00:27:38.493 { 00:27:38.493 "name": null, 00:27:38.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:38.493 "is_configured": false, 00:27:38.493 "data_offset": 0, 00:27:38.493 "data_size": 65536 00:27:38.493 }, 00:27:38.493 { 00:27:38.493 "name": "BaseBdev3", 00:27:38.493 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:38.493 "is_configured": true, 00:27:38.493 "data_offset": 0, 00:27:38.493 "data_size": 65536 00:27:38.493 }, 00:27:38.493 { 00:27:38.493 "name": "BaseBdev4", 00:27:38.493 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:38.493 "is_configured": true, 00:27:38.493 "data_offset": 0, 00:27:38.493 "data_size": 65536 00:27:38.493 } 00:27:38.493 ] 00:27:38.493 }' 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:38.493 [2024-07-15 12:08:51.924817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=965 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.493 12:08:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:39.059 "name": "raid_bdev1", 00:27:39.059 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:39.059 "strip_size_kb": 0, 00:27:39.059 "state": "online", 00:27:39.059 "raid_level": "raid1", 00:27:39.059 "superblock": false, 00:27:39.059 "num_base_bdevs": 4, 00:27:39.059 "num_base_bdevs_discovered": 3, 00:27:39.059 "num_base_bdevs_operational": 3, 00:27:39.059 "process": { 00:27:39.059 "type": "rebuild", 00:27:39.059 "target": "spare", 00:27:39.059 "progress": { 00:27:39.059 "blocks": 49152, 00:27:39.059 "percent": 75 00:27:39.059 } 00:27:39.059 }, 00:27:39.059 "base_bdevs_list": [ 00:27:39.059 { 00:27:39.059 "name": "spare", 00:27:39.059 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:39.059 "is_configured": true, 00:27:39.059 "data_offset": 0, 00:27:39.059 "data_size": 65536 00:27:39.059 }, 00:27:39.059 { 00:27:39.059 "name": null, 00:27:39.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.059 "is_configured": false, 00:27:39.059 "data_offset": 0, 00:27:39.059 "data_size": 65536 00:27:39.059 }, 00:27:39.059 { 00:27:39.059 "name": "BaseBdev3", 00:27:39.059 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:39.059 "is_configured": true, 00:27:39.059 "data_offset": 0, 00:27:39.059 "data_size": 65536 00:27:39.059 }, 00:27:39.059 { 00:27:39.059 "name": "BaseBdev4", 00:27:39.059 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:39.059 "is_configured": true, 00:27:39.059 "data_offset": 0, 00:27:39.059 
"data_size": 65536 00:27:39.059 } 00:27:39.059 ] 00:27:39.059 }' 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:39.059 12:08:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:40.002 [2024-07-15 12:08:53.260583] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:40.002 [2024-07-15 12:08:53.360823] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:40.002 [2024-07-15 12:08:53.371128] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.002 12:08:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:27:40.577 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:40.577 "name": "raid_bdev1", 00:27:40.577 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:40.577 "strip_size_kb": 0, 00:27:40.577 "state": "online", 00:27:40.577 "raid_level": "raid1", 00:27:40.577 "superblock": false, 00:27:40.577 "num_base_bdevs": 4, 00:27:40.577 "num_base_bdevs_discovered": 3, 00:27:40.577 "num_base_bdevs_operational": 3, 00:27:40.577 "base_bdevs_list": [ 00:27:40.577 { 00:27:40.577 "name": "spare", 00:27:40.577 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:40.577 "is_configured": true, 00:27:40.577 "data_offset": 0, 00:27:40.577 "data_size": 65536 00:27:40.577 }, 00:27:40.577 { 00:27:40.577 "name": null, 00:27:40.577 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:40.577 "is_configured": false, 00:27:40.577 "data_offset": 0, 00:27:40.577 "data_size": 65536 00:27:40.577 }, 00:27:40.577 { 00:27:40.577 "name": "BaseBdev3", 00:27:40.577 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:40.577 "is_configured": true, 00:27:40.577 "data_offset": 0, 00:27:40.577 "data_size": 65536 00:27:40.577 }, 00:27:40.577 { 00:27:40.577 "name": "BaseBdev4", 00:27:40.577 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:40.577 "is_configured": true, 00:27:40.577 "data_offset": 0, 00:27:40.577 "data_size": 65536 00:27:40.577 } 00:27:40.577 ] 00:27:40.577 }' 00:27:40.577 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:27:40.868 12:08:54 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.868 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.155 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.155 "name": "raid_bdev1", 00:27:41.155 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:41.155 "strip_size_kb": 0, 00:27:41.155 "state": "online", 00:27:41.155 "raid_level": "raid1", 00:27:41.155 "superblock": false, 00:27:41.155 "num_base_bdevs": 4, 00:27:41.155 "num_base_bdevs_discovered": 3, 00:27:41.155 "num_base_bdevs_operational": 3, 00:27:41.155 "base_bdevs_list": [ 00:27:41.155 { 00:27:41.155 "name": "spare", 00:27:41.155 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:41.155 "is_configured": true, 00:27:41.155 "data_offset": 0, 00:27:41.155 "data_size": 65536 00:27:41.155 }, 00:27:41.155 { 00:27:41.155 "name": null, 00:27:41.155 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.155 "is_configured": false, 00:27:41.155 "data_offset": 0, 00:27:41.155 "data_size": 65536 00:27:41.155 }, 00:27:41.155 { 00:27:41.155 "name": "BaseBdev3", 00:27:41.155 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:41.155 "is_configured": true, 00:27:41.155 "data_offset": 0, 00:27:41.155 "data_size": 65536 
00:27:41.155 }, 00:27:41.155 { 00:27:41.155 "name": "BaseBdev4", 00:27:41.155 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:41.155 "is_configured": true, 00:27:41.155 "data_offset": 0, 00:27:41.155 "data_size": 65536 00:27:41.155 } 00:27:41.155 ] 00:27:41.155 }' 00:27:41.155 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:27:41.413 12:08:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.672 12:08:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:41.672 "name": "raid_bdev1", 00:27:41.672 "uuid": "f24e1ddb-4add-4d65-b0ab-9896efd808f3", 00:27:41.672 "strip_size_kb": 0, 00:27:41.672 "state": "online", 00:27:41.672 "raid_level": "raid1", 00:27:41.672 "superblock": false, 00:27:41.672 "num_base_bdevs": 4, 00:27:41.672 "num_base_bdevs_discovered": 3, 00:27:41.672 "num_base_bdevs_operational": 3, 00:27:41.672 "base_bdevs_list": [ 00:27:41.672 { 00:27:41.672 "name": "spare", 00:27:41.672 "uuid": "a9ec86a6-cc83-5f4a-b003-0545e82ce839", 00:27:41.672 "is_configured": true, 00:27:41.672 "data_offset": 0, 00:27:41.672 "data_size": 65536 00:27:41.672 }, 00:27:41.672 { 00:27:41.672 "name": null, 00:27:41.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:41.673 "is_configured": false, 00:27:41.673 "data_offset": 0, 00:27:41.673 "data_size": 65536 00:27:41.673 }, 00:27:41.673 { 00:27:41.673 "name": "BaseBdev3", 00:27:41.673 "uuid": "5b53ad4b-9d86-5271-a054-c0645ff388e8", 00:27:41.673 "is_configured": true, 00:27:41.673 "data_offset": 0, 00:27:41.673 "data_size": 65536 00:27:41.673 }, 00:27:41.673 { 00:27:41.673 "name": "BaseBdev4", 00:27:41.673 "uuid": "08bd34ec-487f-5903-bdc2-9fc4d9e7409c", 00:27:41.673 "is_configured": true, 00:27:41.673 "data_offset": 0, 00:27:41.673 "data_size": 65536 00:27:41.673 } 00:27:41.673 ] 00:27:41.673 }' 00:27:41.673 12:08:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:41.673 12:08:55 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:42.240 12:08:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:42.498 [2024-07-15 12:08:55.976038] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:42.498 [2024-07-15 12:08:55.976072] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:42.498 00:27:42.498 Latency(us) 00:27:42.498 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:42.498 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:27:42.498 raid_bdev1 : 11.79 91.18 273.54 0.00 0.00 15199.56 290.28 123093.70 00:27:42.498 =================================================================================================================== 00:27:42.498 Total : 91.18 273.54 0.00 0.00 15199.56 290.28 123093.70 00:27:42.498 [2024-07-15 12:08:56.028110] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.498 [2024-07-15 12:08:56.028145] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:42.499 [2024-07-15 12:08:56.028239] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:42.499 [2024-07-15 12:08:56.028251] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e10dc0 name raid_bdev1, state offline 00:27:42.499 0 00:27:42.499 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.499 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 
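The `nbd_start_disks` helper invoked above walks two parallel lists, pairing each bdev name (`spare`) with an `/dev/nbdX` node positionally and attaching it via `rpc.py nbd_start_disk`. A minimal sketch of that pairing loop, with `echo` standing in for the RPC call (the `start_disks` function name and `name:device` pair encoding are illustrative, not part of the SPDK scripts):

```shell
# Sketch of the positional bdev -> nbd pairing nbd_start_disks performs.
# The real helper calls rpc.py -s "$rpc_server" nbd_start_disk "$bdev" "$nbd"
# for each pair; echo stands in for that RPC here.
start_disks() {
    for pair in "$@"; do
        bdev=${pair%%:*}    # text before the first colon: bdev name
        nbd=${pair#*:}      # text after the first colon: /dev/nbdX node
        echo "attach $bdev -> $nbd"
    done
}

start_disks 'spare:/dev/nbd0'
```

In the trace, the same loop later runs again for `BaseBdev3` and `BaseBdev4`, reusing `/dev/nbd1` after each `nbd_stop_disk`.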
00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:42.758 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:27:43.016 /dev/nbd0 00:27:43.016 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:43.016 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:43.016 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:43.017 
12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:43.017 1+0 records in 00:27:43.017 1+0 records out 00:27:43.017 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000268653 s, 15.2 MB/s 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:43.017 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:27:43.276 /dev/nbd1 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:43.276 1+0 records in 00:27:43.276 1+0 records out 00:27:43.276 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287671 s, 14.2 MB/s 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:43.276 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:43.534 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:43.534 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:43.534 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:43.535 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:43.535 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:43.535 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
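The `waitfornbd` calls visible above retry up to 20 times, grepping `/proc/partitions` for the new device name and then `dd`-reading one 4 KiB block with `iflag=direct` to confirm the device answers I/O. A sketch of that polling pattern follows; the `waitfor_blockdev` name and the parameterized partitions file and retry count are illustrative additions for testability, and the `dd` verification step is omitted:

```shell
# Poll a partitions listing until a block device name appears, as
# waitfornbd in autotest_common.sh does against /proc/partitions.
# $1 = device name (e.g. nbd1); $2/$3 (partitions file, retry count)
# are illustrative parameters not present in the real helper.
waitfor_blockdev() {
    name=$1
    partitions=${2:-/proc/partitions}
    retries=${3:-20}
    i=1
    while [ "$i" -le "$retries" ]; do
        if grep -q -w "$name" "$partitions"; then
            return 0    # listed; the real helper then dd-reads one block
        fi
        sleep 0.1
        i=$((i + 1))
    done
    return 1            # device never appeared
}
```

Only after this wait succeeds does the test proceed to `cmp -i 0 /dev/nbd0 /dev/nbd1`, comparing the rebuilt spare against each surviving base bdev.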
00:27:43.535 12:08:56 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:43.793 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:43.793 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:43.793 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:43.793 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:27:43.794 
12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:43.794 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:27:44.128 /dev/nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local i 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # break 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:44.128 1+0 records in 00:27:44.128 1+0 records out 00:27:44.128 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032547 s, 12.6 MB/s 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.128 12:08:57 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # size=4096 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # return 0 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:44.128 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:44.386 12:08:57 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:27:44.645 12:08:58 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 1588978 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@948 -- # '[' -z 1588978 ']' 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@952 -- # kill -0 1588978 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # uname 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1588978 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1588978' 00:27:44.645 killing process with pid 1588978 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # kill 1588978 00:27:44.645 Received shutdown signal, test time was about 13.874124 seconds 00:27:44.645 00:27:44.645 Latency(us) 00:27:44.645 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:44.645 =================================================================================================================== 00:27:44.645 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:27:44.645 [2024-07-15 12:08:58.113525] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:44.645 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@972 -- # wait 1588978 00:27:44.645 [2024-07-15 12:08:58.154000] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:27:44.904 00:27:44.904 real 0m19.960s 00:27:44.904 user 0m32.175s 00:27:44.904 sys 0m3.628s 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:27:44.904 ************************************ 00:27:44.904 END TEST raid_rebuild_test_io 00:27:44.904 ************************************ 00:27:44.904 12:08:58 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:27:44.904 12:08:58 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:27:44.904 12:08:58 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:27:44.904 12:08:58 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:27:44.904 12:08:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:44.904 ************************************ 00:27:44.904 START TEST raid_rebuild_test_sb_io 00:27:44.904 ************************************ 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 4 true true true 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:44.904 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:44.905 
12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # 
local raid_bdev_size 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=1591753 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 1591753 /var/tmp/spdk-raid.sock 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@829 -- # '[' -z 1591753 ']' 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local max_retries=100 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:44.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
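`waitforlisten` above blocks until the freshly launched bdevperf process creates its UNIX-domain RPC socket (`/var/tmp/spdk-raid.sock`). The sketch below only polls for the socket path to appear; the real helper additionally confirms the RPC server responds, and the `waitforlisten_sock` name and retry parameter are illustrative:

```shell
# Wait for a server's UNIX-domain socket path to show up, a simplified
# stand-in for waitforlisten; existence of the path is checked here,
# whereas the real helper also issues an RPC to verify readiness.
waitforlisten_sock() {
    sock=$1
    retries=${2:-50}
    i=1
    while [ "$i" -le "$retries" ]; do
        [ -e "$sock" ] && return 0   # path present; server is (likely) up
        sleep 0.1
        i=$((i + 1))
    done
    return 1                         # timed out waiting for the socket
}
```

Once the socket is live, the per-bdev `bdev_malloc_create`/`bdev_passthru_create` RPCs that follow in the trace can be issued against it.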
00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # xtrace_disable 00:27:44.905 12:08:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:45.164 [2024-07-15 12:08:58.531144] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:27:45.164 [2024-07-15 12:08:58.531199] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1591753 ] 00:27:45.164 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:45.164 Zero copy mechanism will not be used. 00:27:45.165 [2024-07-15 12:08:58.643690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:45.165 [2024-07-15 12:08:58.744876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:45.425 [2024-07-15 12:08:58.808215] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:45.425 [2024-07-15 12:08:58.808273] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:45.992 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:27:45.992 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@862 -- # return 0 00:27:45.992 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:45.992 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:27:46.249 BaseBdev1_malloc 00:27:46.249 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:27:46.507 [2024-07-15 12:08:59.878029] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:46.507 [2024-07-15 12:08:59.878076] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:46.507 [2024-07-15 12:08:59.878100] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e39c0 00:27:46.507 [2024-07-15 12:08:59.878112] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:46.507 [2024-07-15 12:08:59.879649] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:46.507 [2024-07-15 12:08:59.879678] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:46.507 BaseBdev1 00:27:46.507 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:46.507 12:08:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:27:47.072 BaseBdev2_malloc 00:27:47.072 12:09:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:47.331 [2024-07-15 12:09:00.730298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:47.331 [2024-07-15 12:09:00.730348] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:47.331 [2024-07-15 12:09:00.730373] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e4510 00:27:47.331 [2024-07-15 12:09:00.730385] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:47.331 [2024-07-15 12:09:00.731863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:47.331 [2024-07-15 
12:09:00.731892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:47.331 BaseBdev2 00:27:47.331 12:09:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:47.331 12:09:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:27:47.589 BaseBdev3_malloc 00:27:47.589 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:27:47.847 [2024-07-15 12:09:01.236190] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:27:47.847 [2024-07-15 12:09:01.236239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:47.847 [2024-07-15 12:09:01.236261] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278c310 00:27:47.847 [2024-07-15 12:09:01.236274] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:47.847 [2024-07-15 12:09:01.237825] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:47.847 [2024-07-15 12:09:01.237856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:27:47.847 BaseBdev3 00:27:47.847 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:47.847 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:27:48.105 BaseBdev4_malloc 00:27:48.105 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:27:48.363 [2024-07-15 12:09:01.718085] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:27:48.363 [2024-07-15 12:09:01.718138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.363 [2024-07-15 12:09:01.718161] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278c7f0 00:27:48.363 [2024-07-15 12:09:01.718174] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.364 [2024-07-15 12:09:01.719729] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.364 [2024-07-15 12:09:01.719758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:27:48.364 BaseBdev4 00:27:48.364 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:27:48.622 spare_malloc 00:27:48.622 12:09:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:48.622 spare_delay 00:27:48.881 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:48.881 [2024-07-15 12:09:02.461154] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:48.881 [2024-07-15 12:09:02.461197] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.881 [2024-07-15 12:09:02.461222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x25dc700 00:27:48.881 [2024-07-15 12:09:02.461234] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.881 [2024-07-15 12:09:02.462659] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.881 [2024-07-15 12:09:02.462697] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:48.881 spare 00:27:49.139 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:27:49.140 [2024-07-15 12:09:02.713863] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:49.140 [2024-07-15 12:09:02.715246] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:49.140 [2024-07-15 12:09:02.715303] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:27:49.140 [2024-07-15 12:09:02.715349] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:27:49.140 [2024-07-15 12:09:02.715548] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25dddc0 00:27:49.140 [2024-07-15 12:09:02.715560] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:27:49.140 [2024-07-15 12:09:02.715781] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2788b50 00:27:49.140 [2024-07-15 12:09:02.715936] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25dddc0 00:27:49.140 [2024-07-15 12:09:02.715947] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25dddc0 00:27:49.140 [2024-07-15 12:09:02.716049] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:27:49.140 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.398 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.398 "name": "raid_bdev1", 00:27:49.398 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:49.398 "strip_size_kb": 0, 00:27:49.398 "state": "online", 00:27:49.398 "raid_level": "raid1", 00:27:49.398 "superblock": true, 00:27:49.398 "num_base_bdevs": 4, 00:27:49.398 "num_base_bdevs_discovered": 4, 00:27:49.398 "num_base_bdevs_operational": 4, 00:27:49.398 "base_bdevs_list": [ 00:27:49.398 { 00:27:49.398 "name": "BaseBdev1", 00:27:49.398 "uuid": "d9dfac0f-64c4-5dd1-ae8c-e2101d49158d", 00:27:49.398 
"is_configured": true, 00:27:49.398 "data_offset": 2048, 00:27:49.398 "data_size": 63488 00:27:49.398 }, 00:27:49.398 { 00:27:49.398 "name": "BaseBdev2", 00:27:49.398 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:49.398 "is_configured": true, 00:27:49.398 "data_offset": 2048, 00:27:49.398 "data_size": 63488 00:27:49.398 }, 00:27:49.398 { 00:27:49.398 "name": "BaseBdev3", 00:27:49.398 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:49.398 "is_configured": true, 00:27:49.398 "data_offset": 2048, 00:27:49.399 "data_size": 63488 00:27:49.399 }, 00:27:49.399 { 00:27:49.399 "name": "BaseBdev4", 00:27:49.399 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:49.399 "is_configured": true, 00:27:49.399 "data_offset": 2048, 00:27:49.399 "data_size": 63488 00:27:49.399 } 00:27:49.399 ] 00:27:49.399 }' 00:27:49.399 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.399 12:09:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:49.964 12:09:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:49.964 12:09:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:50.222 [2024-07-15 12:09:03.752877] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:50.222 12:09:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:27:50.222 12:09:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.222 12:09:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:50.479 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 
00:27:50.479 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:27:50.479 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:50.479 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:27:50.738 [2024-07-15 12:09:04.119615] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2787fe0 00:27:50.738 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:50.738 Zero copy mechanism will not be used. 00:27:50.738 Running I/O for 60 seconds... 00:27:50.738 [2024-07-15 12:09:04.271963] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:50.738 [2024-07-15 12:09:04.280217] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2787fe0 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.738 12:09:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.738 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.996 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.996 "name": "raid_bdev1", 00:27:50.996 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:50.996 "strip_size_kb": 0, 00:27:50.996 "state": "online", 00:27:50.996 "raid_level": "raid1", 00:27:50.996 "superblock": true, 00:27:50.996 "num_base_bdevs": 4, 00:27:50.996 "num_base_bdevs_discovered": 3, 00:27:50.997 "num_base_bdevs_operational": 3, 00:27:50.997 "base_bdevs_list": [ 00:27:50.997 { 00:27:50.997 "name": null, 00:27:50.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.997 "is_configured": false, 00:27:50.997 "data_offset": 2048, 00:27:50.997 "data_size": 63488 00:27:50.997 }, 00:27:50.997 { 00:27:50.997 "name": "BaseBdev2", 00:27:50.997 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:50.997 "is_configured": true, 00:27:50.997 "data_offset": 2048, 00:27:50.997 "data_size": 63488 00:27:50.997 }, 00:27:50.997 { 00:27:50.997 "name": "BaseBdev3", 00:27:50.997 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:50.997 "is_configured": true, 00:27:50.997 "data_offset": 2048, 00:27:50.997 "data_size": 63488 00:27:50.997 }, 00:27:50.997 { 00:27:50.997 "name": "BaseBdev4", 00:27:50.997 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:50.997 "is_configured": true, 00:27:50.997 "data_offset": 2048, 00:27:50.997 "data_size": 63488 00:27:50.997 } 00:27:50.997 ] 00:27:50.997 }' 00:27:50.997 12:09:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.997 12:09:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:51.934 12:09:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:51.934 [2024-07-15 12:09:05.439857] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:51.934 12:09:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:51.934 [2024-07-15 12:09:05.497561] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25e1290 00:27:51.934 [2024-07-15 12:09:05.500002] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:52.194 [2024-07-15 12:09:05.620812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:52.194 [2024-07-15 12:09:05.622090] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:52.453 [2024-07-15 12:09:05.846750] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:52.453 [2024-07-15 12:09:05.846932] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:52.712 [2024-07-15 12:09:06.185123] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:52.971 [2024-07-15 12:09:06.352161] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:27:52.971 [2024-07-15 12:09:06.352797] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 
00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:52.971 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:53.231 [2024-07-15 12:09:06.727222] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:27:53.231 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:53.231 "name": "raid_bdev1", 00:27:53.231 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:53.231 "strip_size_kb": 0, 00:27:53.231 "state": "online", 00:27:53.231 "raid_level": "raid1", 00:27:53.231 "superblock": true, 00:27:53.231 "num_base_bdevs": 4, 00:27:53.231 "num_base_bdevs_discovered": 4, 00:27:53.231 "num_base_bdevs_operational": 4, 00:27:53.231 "process": { 00:27:53.231 "type": "rebuild", 00:27:53.231 "target": "spare", 00:27:53.231 "progress": { 00:27:53.231 "blocks": 12288, 00:27:53.231 "percent": 19 00:27:53.231 } 00:27:53.231 }, 00:27:53.231 "base_bdevs_list": [ 00:27:53.231 { 00:27:53.231 "name": "spare", 00:27:53.231 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:27:53.231 "is_configured": true, 00:27:53.231 "data_offset": 2048, 00:27:53.231 "data_size": 63488 00:27:53.231 }, 
00:27:53.231 { 00:27:53.231 "name": "BaseBdev2", 00:27:53.231 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:53.231 "is_configured": true, 00:27:53.231 "data_offset": 2048, 00:27:53.231 "data_size": 63488 00:27:53.231 }, 00:27:53.231 { 00:27:53.231 "name": "BaseBdev3", 00:27:53.231 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:53.231 "is_configured": true, 00:27:53.231 "data_offset": 2048, 00:27:53.231 "data_size": 63488 00:27:53.231 }, 00:27:53.231 { 00:27:53.231 "name": "BaseBdev4", 00:27:53.231 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:53.231 "is_configured": true, 00:27:53.231 "data_offset": 2048, 00:27:53.231 "data_size": 63488 00:27:53.231 } 00:27:53.231 ] 00:27:53.231 }' 00:27:53.231 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:53.231 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:53.231 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:53.490 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:53.490 12:09:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:53.490 [2024-07-15 12:09:06.968993] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:53.490 [2024-07-15 12:09:06.969603] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:53.490 [2024-07-15 12:09:07.068585] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:53.749 [2024-07-15 12:09:07.119023] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:53.749 
[2024-07-15 12:09:07.229531] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:53.749 [2024-07-15 12:09:07.232593] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:53.749 [2024-07-15 12:09:07.232621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:53.749 [2024-07-15 12:09:07.232630] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:53.749 [2024-07-15 12:09:07.256039] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2787fe0 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:53.749 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.749 12:09:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.007 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.007 "name": "raid_bdev1", 00:27:54.007 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:54.007 "strip_size_kb": 0, 00:27:54.007 "state": "online", 00:27:54.007 "raid_level": "raid1", 00:27:54.007 "superblock": true, 00:27:54.007 "num_base_bdevs": 4, 00:27:54.007 "num_base_bdevs_discovered": 3, 00:27:54.007 "num_base_bdevs_operational": 3, 00:27:54.007 "base_bdevs_list": [ 00:27:54.007 { 00:27:54.007 "name": null, 00:27:54.007 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.007 "is_configured": false, 00:27:54.007 "data_offset": 2048, 00:27:54.007 "data_size": 63488 00:27:54.007 }, 00:27:54.007 { 00:27:54.007 "name": "BaseBdev2", 00:27:54.007 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:54.007 "is_configured": true, 00:27:54.007 "data_offset": 2048, 00:27:54.007 "data_size": 63488 00:27:54.007 }, 00:27:54.007 { 00:27:54.007 "name": "BaseBdev3", 00:27:54.007 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:54.007 "is_configured": true, 00:27:54.007 "data_offset": 2048, 00:27:54.007 "data_size": 63488 00:27:54.007 }, 00:27:54.007 { 00:27:54.007 "name": "BaseBdev4", 00:27:54.007 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:54.007 "is_configured": true, 00:27:54.007 "data_offset": 2048, 00:27:54.007 "data_size": 63488 00:27:54.007 } 00:27:54.007 ] 00:27:54.007 }' 00:27:54.008 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.008 12:09:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:54.943 "name": "raid_bdev1", 00:27:54.943 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:54.943 "strip_size_kb": 0, 00:27:54.943 "state": "online", 00:27:54.943 "raid_level": "raid1", 00:27:54.943 "superblock": true, 00:27:54.943 "num_base_bdevs": 4, 00:27:54.943 "num_base_bdevs_discovered": 3, 00:27:54.943 "num_base_bdevs_operational": 3, 00:27:54.943 "base_bdevs_list": [ 00:27:54.943 { 00:27:54.943 "name": null, 00:27:54.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.943 "is_configured": false, 00:27:54.943 "data_offset": 2048, 00:27:54.943 "data_size": 63488 00:27:54.943 }, 00:27:54.943 { 00:27:54.943 "name": "BaseBdev2", 00:27:54.943 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:54.943 "is_configured": true, 00:27:54.943 "data_offset": 2048, 00:27:54.943 "data_size": 63488 00:27:54.943 }, 00:27:54.943 { 00:27:54.943 "name": "BaseBdev3", 00:27:54.943 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:54.943 "is_configured": true, 00:27:54.943 "data_offset": 2048, 00:27:54.943 "data_size": 63488 00:27:54.943 }, 00:27:54.943 { 00:27:54.943 "name": "BaseBdev4", 00:27:54.943 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:54.943 "is_configured": true, 00:27:54.943 "data_offset": 
2048, 00:27:54.943 "data_size": 63488 00:27:54.943 } 00:27:54.943 ] 00:27:54.943 }' 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:54.943 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:55.202 [2024-07-15 12:09:08.764051] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:55.461 12:09:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:55.461 [2024-07-15 12:09:08.839123] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x260e430 00:27:55.461 [2024-07-15 12:09:08.840697] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:55.461 [2024-07-15 12:09:08.972555] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:55.461 [2024-07-15 12:09:08.973140] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:27:55.721 [2024-07-15 12:09:09.134829] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:27:55.979 [2024-07-15 12:09:09.389517] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:27:56.239 [2024-07-15 12:09:09.631867] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 
offset_begin: 6144 offset_end: 12288 00:27:56.239 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:56.239 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:56.239 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:56.239 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:56.239 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:56.499 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.499 12:09:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.499 [2024-07-15 12:09:09.987464] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:56.499 [2024-07-15 12:09:09.987650] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:27:56.499 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.499 "name": "raid_bdev1", 00:27:56.499 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:56.499 "strip_size_kb": 0, 00:27:56.499 "state": "online", 00:27:56.499 "raid_level": "raid1", 00:27:56.499 "superblock": true, 00:27:56.499 "num_base_bdevs": 4, 00:27:56.499 "num_base_bdevs_discovered": 4, 00:27:56.499 "num_base_bdevs_operational": 4, 00:27:56.499 "process": { 00:27:56.499 "type": "rebuild", 00:27:56.499 "target": "spare", 00:27:56.499 "progress": { 00:27:56.499 "blocks": 16384, 00:27:56.499 "percent": 25 00:27:56.499 } 00:27:56.499 }, 00:27:56.499 "base_bdevs_list": [ 00:27:56.499 { 00:27:56.499 
"name": "spare", 00:27:56.499 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:27:56.499 "is_configured": true, 00:27:56.499 "data_offset": 2048, 00:27:56.499 "data_size": 63488 00:27:56.499 }, 00:27:56.499 { 00:27:56.499 "name": "BaseBdev2", 00:27:56.499 "uuid": "d9a8de03-eb20-5c50-8947-5cafc5c6a472", 00:27:56.499 "is_configured": true, 00:27:56.499 "data_offset": 2048, 00:27:56.499 "data_size": 63488 00:27:56.499 }, 00:27:56.499 { 00:27:56.499 "name": "BaseBdev3", 00:27:56.499 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:56.499 "is_configured": true, 00:27:56.499 "data_offset": 2048, 00:27:56.499 "data_size": 63488 00:27:56.499 }, 00:27:56.499 { 00:27:56.499 "name": "BaseBdev4", 00:27:56.499 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:56.499 "is_configured": true, 00:27:56.499 "data_offset": 2048, 00:27:56.499 "data_size": 63488 00:27:56.499 } 00:27:56.499 ] 00:27:56.499 }' 00:27:56.499 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:56.758 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:27:56.758 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:27:56.758 [2024-07-15 12:09:10.315445] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:27:57.017 [2024-07-15 12:09:10.403761] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:57.017 [2024-07-15 12:09:10.555276] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:27:57.276 [2024-07-15 12:09:10.776706] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2787fe0 00:27:57.276 [2024-07-15 12:09:10.776742] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x260e430 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.276 12:09:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.276 12:09:10 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.535 [2024-07-15 12:09:11.027496] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:27:57.535 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.535 "name": "raid_bdev1", 00:27:57.535 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:57.535 "strip_size_kb": 0, 00:27:57.535 "state": "online", 00:27:57.535 "raid_level": "raid1", 00:27:57.535 "superblock": true, 00:27:57.535 "num_base_bdevs": 4, 00:27:57.535 "num_base_bdevs_discovered": 3, 00:27:57.535 "num_base_bdevs_operational": 3, 00:27:57.535 "process": { 00:27:57.535 "type": "rebuild", 00:27:57.535 "target": "spare", 00:27:57.535 "progress": { 00:27:57.535 "blocks": 26624, 00:27:57.535 "percent": 41 00:27:57.535 } 00:27:57.535 }, 00:27:57.535 "base_bdevs_list": [ 00:27:57.535 { 00:27:57.535 "name": "spare", 00:27:57.535 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:27:57.535 "is_configured": true, 00:27:57.535 "data_offset": 2048, 00:27:57.535 "data_size": 63488 00:27:57.535 }, 00:27:57.535 { 00:27:57.535 "name": null, 00:27:57.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.535 "is_configured": false, 00:27:57.535 "data_offset": 2048, 00:27:57.535 "data_size": 63488 00:27:57.535 }, 00:27:57.535 { 00:27:57.535 "name": "BaseBdev3", 00:27:57.535 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:57.535 "is_configured": true, 00:27:57.535 "data_offset": 2048, 00:27:57.535 "data_size": 63488 00:27:57.535 }, 00:27:57.535 { 00:27:57.535 "name": "BaseBdev4", 00:27:57.535 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:57.535 "is_configured": true, 00:27:57.535 "data_offset": 2048, 00:27:57.535 "data_size": 63488 00:27:57.535 } 00:27:57.535 ] 00:27:57.535 }' 00:27:57.535 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:27:57.794 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:57.795 [2024-07-15 12:09:11.175184] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=985 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.795 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:58.053 "name": "raid_bdev1", 00:27:58.053 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:58.053 "strip_size_kb": 0, 00:27:58.053 "state": "online", 00:27:58.053 "raid_level": "raid1", 00:27:58.053 "superblock": true, 
00:27:58.053 "num_base_bdevs": 4, 00:27:58.053 "num_base_bdevs_discovered": 3, 00:27:58.053 "num_base_bdevs_operational": 3, 00:27:58.053 "process": { 00:27:58.053 "type": "rebuild", 00:27:58.053 "target": "spare", 00:27:58.053 "progress": { 00:27:58.053 "blocks": 28672, 00:27:58.053 "percent": 45 00:27:58.053 } 00:27:58.053 }, 00:27:58.053 "base_bdevs_list": [ 00:27:58.053 { 00:27:58.053 "name": "spare", 00:27:58.053 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:27:58.053 "is_configured": true, 00:27:58.053 "data_offset": 2048, 00:27:58.053 "data_size": 63488 00:27:58.053 }, 00:27:58.053 { 00:27:58.053 "name": null, 00:27:58.053 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.053 "is_configured": false, 00:27:58.053 "data_offset": 2048, 00:27:58.053 "data_size": 63488 00:27:58.053 }, 00:27:58.053 { 00:27:58.053 "name": "BaseBdev3", 00:27:58.053 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:58.053 "is_configured": true, 00:27:58.053 "data_offset": 2048, 00:27:58.053 "data_size": 63488 00:27:58.053 }, 00:27:58.053 { 00:27:58.053 "name": "BaseBdev4", 00:27:58.053 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:58.053 "is_configured": true, 00:27:58.053 "data_offset": 2048, 00:27:58.053 "data_size": 63488 00:27:58.053 } 00:27:58.053 ] 00:27:58.053 }' 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:58.053 12:09:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:58.312 [2024-07-15 12:09:11.857626] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 
36864 offset_end: 43008 00:27:58.880 [2024-07-15 12:09:12.446807] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.150 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:59.410 "name": "raid_bdev1", 00:27:59.410 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:27:59.410 "strip_size_kb": 0, 00:27:59.410 "state": "online", 00:27:59.410 "raid_level": "raid1", 00:27:59.410 "superblock": true, 00:27:59.410 "num_base_bdevs": 4, 00:27:59.410 "num_base_bdevs_discovered": 3, 00:27:59.410 "num_base_bdevs_operational": 3, 00:27:59.410 "process": { 00:27:59.410 "type": "rebuild", 00:27:59.410 "target": "spare", 00:27:59.410 "progress": { 00:27:59.410 "blocks": 51200, 00:27:59.410 "percent": 80 00:27:59.410 } 00:27:59.410 }, 00:27:59.410 "base_bdevs_list": [ 00:27:59.410 { 00:27:59.410 "name": "spare", 00:27:59.410 "uuid": 
"e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:27:59.410 "is_configured": true, 00:27:59.410 "data_offset": 2048, 00:27:59.410 "data_size": 63488 00:27:59.410 }, 00:27:59.410 { 00:27:59.410 "name": null, 00:27:59.410 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:59.410 "is_configured": false, 00:27:59.410 "data_offset": 2048, 00:27:59.410 "data_size": 63488 00:27:59.410 }, 00:27:59.410 { 00:27:59.410 "name": "BaseBdev3", 00:27:59.410 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:27:59.410 "is_configured": true, 00:27:59.410 "data_offset": 2048, 00:27:59.410 "data_size": 63488 00:27:59.410 }, 00:27:59.410 { 00:27:59.410 "name": "BaseBdev4", 00:27:59.410 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:27:59.410 "is_configured": true, 00:27:59.410 "data_offset": 2048, 00:27:59.410 "data_size": 63488 00:27:59.410 } 00:27:59.410 ] 00:27:59.410 }' 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:59.410 12:09:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:59.669 [2024-07-15 12:09:13.029812] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:27:59.669 [2024-07-15 12:09:13.030134] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:27:59.669 [2024-07-15 12:09:13.232632] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:27:59.669 [2024-07-15 12:09:13.232830] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 59392 offset_begin: 55296 offset_end: 61440 00:28:00.236 [2024-07-15 12:09:13.564332] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:00.236 [2024-07-15 12:09:13.664614] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:00.236 [2024-07-15 12:09:13.666486] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.495 12:09:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.754 "name": "raid_bdev1", 00:28:00.754 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:00.754 "strip_size_kb": 0, 00:28:00.754 "state": "online", 00:28:00.754 "raid_level": "raid1", 00:28:00.754 "superblock": true, 00:28:00.754 "num_base_bdevs": 4, 00:28:00.754 "num_base_bdevs_discovered": 3, 00:28:00.754 "num_base_bdevs_operational": 3, 00:28:00.754 
"base_bdevs_list": [ 00:28:00.754 { 00:28:00.754 "name": "spare", 00:28:00.754 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:00.754 "is_configured": true, 00:28:00.754 "data_offset": 2048, 00:28:00.754 "data_size": 63488 00:28:00.754 }, 00:28:00.754 { 00:28:00.754 "name": null, 00:28:00.754 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.754 "is_configured": false, 00:28:00.754 "data_offset": 2048, 00:28:00.754 "data_size": 63488 00:28:00.754 }, 00:28:00.754 { 00:28:00.754 "name": "BaseBdev3", 00:28:00.754 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:00.754 "is_configured": true, 00:28:00.754 "data_offset": 2048, 00:28:00.754 "data_size": 63488 00:28:00.754 }, 00:28:00.754 { 00:28:00.754 "name": "BaseBdev4", 00:28:00.754 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:00.754 "is_configured": true, 00:28:00.754 "data_offset": 2048, 00:28:00.754 "data_size": 63488 00:28:00.754 } 00:28:00.754 ] 00:28:00.754 }' 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:00.754 12:09:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.754 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.013 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:01.013 "name": "raid_bdev1", 00:28:01.013 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:01.013 "strip_size_kb": 0, 00:28:01.013 "state": "online", 00:28:01.013 "raid_level": "raid1", 00:28:01.013 "superblock": true, 00:28:01.013 "num_base_bdevs": 4, 00:28:01.013 "num_base_bdevs_discovered": 3, 00:28:01.013 "num_base_bdevs_operational": 3, 00:28:01.013 "base_bdevs_list": [ 00:28:01.013 { 00:28:01.013 "name": "spare", 00:28:01.013 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:01.013 "is_configured": true, 00:28:01.013 "data_offset": 2048, 00:28:01.013 "data_size": 63488 00:28:01.013 }, 00:28:01.013 { 00:28:01.013 "name": null, 00:28:01.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.013 "is_configured": false, 00:28:01.013 "data_offset": 2048, 00:28:01.013 "data_size": 63488 00:28:01.013 }, 00:28:01.013 { 00:28:01.013 "name": "BaseBdev3", 00:28:01.013 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:01.013 "is_configured": true, 00:28:01.014 "data_offset": 2048, 00:28:01.014 "data_size": 63488 00:28:01.014 }, 00:28:01.014 { 00:28:01.014 "name": "BaseBdev4", 00:28:01.014 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:01.014 "is_configured": true, 00:28:01.014 "data_offset": 2048, 00:28:01.014 "data_size": 63488 00:28:01.014 } 00:28:01.014 ] 00:28:01.014 }' 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:01.014 12:09:14 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:01.014 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:01.273 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.273 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.273 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:01.273 "name": "raid_bdev1", 00:28:01.273 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:01.273 
"strip_size_kb": 0, 00:28:01.273 "state": "online", 00:28:01.273 "raid_level": "raid1", 00:28:01.273 "superblock": true, 00:28:01.273 "num_base_bdevs": 4, 00:28:01.273 "num_base_bdevs_discovered": 3, 00:28:01.273 "num_base_bdevs_operational": 3, 00:28:01.273 "base_bdevs_list": [ 00:28:01.273 { 00:28:01.273 "name": "spare", 00:28:01.273 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:01.273 "is_configured": true, 00:28:01.273 "data_offset": 2048, 00:28:01.273 "data_size": 63488 00:28:01.273 }, 00:28:01.273 { 00:28:01.273 "name": null, 00:28:01.273 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.273 "is_configured": false, 00:28:01.273 "data_offset": 2048, 00:28:01.273 "data_size": 63488 00:28:01.273 }, 00:28:01.273 { 00:28:01.273 "name": "BaseBdev3", 00:28:01.273 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:01.273 "is_configured": true, 00:28:01.273 "data_offset": 2048, 00:28:01.273 "data_size": 63488 00:28:01.273 }, 00:28:01.273 { 00:28:01.273 "name": "BaseBdev4", 00:28:01.273 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:01.273 "is_configured": true, 00:28:01.273 "data_offset": 2048, 00:28:01.273 "data_size": 63488 00:28:01.273 } 00:28:01.273 ] 00:28:01.273 }' 00:28:01.273 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:01.273 12:09:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:02.210 12:09:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:02.470 [2024-07-15 12:09:15.855840] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:02.470 [2024-07-15 12:09:15.855872] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:02.470 00:28:02.470 Latency(us) 00:28:02.470 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:28:02.470 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:28:02.470 raid_bdev1 : 11.73 88.55 265.65 0.00 0.00 15328.22 300.97 121270.09 00:28:02.470 =================================================================================================================== 00:28:02.470 Total : 88.55 265.65 0.00 0.00 15328.22 300.97 121270.09 00:28:02.470 [2024-07-15 12:09:15.887883] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.470 [2024-07-15 12:09:15.887913] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:02.470 [2024-07-15 12:09:15.888006] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:02.470 [2024-07-15 12:09:15.888019] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25dddc0 name raid_bdev1, state offline 00:28:02.470 0 00:28:02.470 12:09:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.470 12:09:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
local bdev_list 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:02.729 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:28:02.988 /dev/nbd0 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:02.988 1+0 records in 00:28:02.988 1+0 records out 00:28:02.988 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269961 s, 15.2 MB/s 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev3') 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:02.988 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:28:03.246 /dev/nbd1 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:03.246 1+0 records in 00:28:03.246 1+0 records out 00:28:03.246 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000266307 s, 15.4 MB/s 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:03.246 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:03.504 12:09:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:28:03.763 12:09:17 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:28:03.763 /dev/nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local i 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # break 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:03.763 1+0 records in 00:28:03.763 1+0 records out 00:28:03.763 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000274733 s, 14.9 MB/s 00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:28:03.763 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # size=4096 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # return 0 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:04.022 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 
00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:04.282 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:28:04.541 12:09:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:04.800 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:05.059 [2024-07-15 12:09:18.439485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:05.059 [2024-07-15 12:09:18.439533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:05.059 [2024-07-15 12:09:18.439560] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25dd0c0 00:28:05.059 [2024-07-15 12:09:18.439574] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:05.059 [2024-07-15 12:09:18.441388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:05.059 [2024-07-15 12:09:18.441421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:05.059 [2024-07-15 12:09:18.441509] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:05.059 [2024-07-15 12:09:18.441538] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:05.059 [2024-07-15 12:09:18.441649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:28:05.059 [2024-07-15 12:09:18.441736] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:28:05.059 spare 00:28:05.059 12:09:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:28:05.059 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.059 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.060 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.060 [2024-07-15 12:09:18.542056] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25e0770 00:28:05.060 [2024-07-15 12:09:18.542075] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:28:05.060 [2024-07-15 12:09:18.542292] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2789ef0 00:28:05.060 [2024-07-15 12:09:18.542456] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25e0770 00:28:05.060 [2024-07-15 12:09:18.542466] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25e0770 00:28:05.060 [2024-07-15 12:09:18.542583] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:05.319 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.319 "name": "raid_bdev1", 00:28:05.319 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:05.319 "strip_size_kb": 0, 00:28:05.319 "state": "online", 00:28:05.319 "raid_level": "raid1", 00:28:05.319 "superblock": true, 00:28:05.319 "num_base_bdevs": 4, 00:28:05.319 "num_base_bdevs_discovered": 3, 00:28:05.319 "num_base_bdevs_operational": 3, 00:28:05.319 "base_bdevs_list": [ 00:28:05.319 { 00:28:05.319 "name": "spare", 00:28:05.319 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:05.319 "is_configured": true, 00:28:05.319 "data_offset": 2048, 00:28:05.319 "data_size": 63488 00:28:05.319 }, 00:28:05.319 { 00:28:05.319 "name": null, 00:28:05.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:05.319 "is_configured": false, 00:28:05.319 "data_offset": 2048, 00:28:05.319 "data_size": 63488 00:28:05.319 }, 00:28:05.319 { 00:28:05.319 "name": "BaseBdev3", 00:28:05.319 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:05.319 "is_configured": true, 00:28:05.319 "data_offset": 2048, 00:28:05.319 "data_size": 63488 00:28:05.319 }, 00:28:05.319 { 00:28:05.319 "name": "BaseBdev4", 00:28:05.319 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:05.319 "is_configured": true, 00:28:05.319 "data_offset": 2048, 00:28:05.319 "data_size": 63488 00:28:05.319 } 00:28:05.319 ] 00:28:05.319 }' 00:28:05.319 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.319 12:09:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:05.887 
12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.887 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:06.455 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:06.455 "name": "raid_bdev1", 00:28:06.455 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:06.455 "strip_size_kb": 0, 00:28:06.455 "state": "online", 00:28:06.455 "raid_level": "raid1", 00:28:06.455 "superblock": true, 00:28:06.455 "num_base_bdevs": 4, 00:28:06.455 "num_base_bdevs_discovered": 3, 00:28:06.455 "num_base_bdevs_operational": 3, 00:28:06.455 "base_bdevs_list": [ 00:28:06.455 { 00:28:06.455 "name": "spare", 00:28:06.455 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:06.456 "is_configured": true, 00:28:06.456 "data_offset": 2048, 00:28:06.456 "data_size": 63488 00:28:06.456 }, 00:28:06.456 { 00:28:06.456 "name": null, 00:28:06.456 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:06.456 "is_configured": false, 00:28:06.456 "data_offset": 2048, 00:28:06.456 "data_size": 63488 00:28:06.456 }, 00:28:06.456 { 00:28:06.456 "name": "BaseBdev3", 00:28:06.456 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:06.456 "is_configured": true, 00:28:06.456 "data_offset": 2048, 00:28:06.456 "data_size": 63488 00:28:06.456 }, 00:28:06.456 { 00:28:06.456 "name": "BaseBdev4", 00:28:06.456 "uuid": 
"72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:06.456 "is_configured": true, 00:28:06.456 "data_offset": 2048, 00:28:06.456 "data_size": 63488 00:28:06.456 } 00:28:06.456 ] 00:28:06.456 }' 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.456 12:09:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:06.715 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:28:06.715 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:06.973 [2024-07-15 12:09:20.397114] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:06.973 12:09:20 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:06.973 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.231 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:07.231 "name": "raid_bdev1", 00:28:07.231 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:07.231 "strip_size_kb": 0, 00:28:07.231 "state": "online", 00:28:07.231 "raid_level": "raid1", 00:28:07.231 "superblock": true, 00:28:07.231 "num_base_bdevs": 4, 00:28:07.231 "num_base_bdevs_discovered": 2, 00:28:07.231 "num_base_bdevs_operational": 2, 00:28:07.231 "base_bdevs_list": [ 00:28:07.231 { 00:28:07.231 "name": null, 00:28:07.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.231 "is_configured": false, 00:28:07.231 "data_offset": 2048, 00:28:07.231 "data_size": 63488 00:28:07.231 }, 00:28:07.231 { 00:28:07.231 "name": null, 00:28:07.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:07.231 "is_configured": false, 00:28:07.231 "data_offset": 2048, 00:28:07.231 "data_size": 63488 00:28:07.231 }, 00:28:07.231 { 00:28:07.231 "name": "BaseBdev3", 00:28:07.231 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:07.231 "is_configured": true, 00:28:07.231 "data_offset": 2048, 00:28:07.231 "data_size": 
63488 00:28:07.231 }, 00:28:07.231 { 00:28:07.231 "name": "BaseBdev4", 00:28:07.231 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:07.231 "is_configured": true, 00:28:07.231 "data_offset": 2048, 00:28:07.231 "data_size": 63488 00:28:07.231 } 00:28:07.231 ] 00:28:07.231 }' 00:28:07.231 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:07.231 12:09:20 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:07.796 12:09:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:08.055 [2024-07-15 12:09:21.399914] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:08.055 [2024-07-15 12:09:21.400084] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:08.055 [2024-07-15 12:09:21.400101] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:28:08.055 [2024-07-15 12:09:21.400129] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:08.055 [2024-07-15 12:09:21.404566] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22eb230 00:28:08.055 [2024-07-15 12:09:21.406711] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:08.055 12:09:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.989 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:09.247 "name": "raid_bdev1", 00:28:09.247 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:09.247 "strip_size_kb": 0, 00:28:09.247 "state": "online", 00:28:09.247 "raid_level": "raid1", 00:28:09.247 "superblock": true, 00:28:09.247 "num_base_bdevs": 4, 00:28:09.247 "num_base_bdevs_discovered": 3, 00:28:09.247 "num_base_bdevs_operational": 3, 00:28:09.247 "process": { 00:28:09.247 "type": "rebuild", 00:28:09.247 "target": "spare", 00:28:09.247 "progress": { 00:28:09.247 "blocks": 24576, 
00:28:09.247 "percent": 38 00:28:09.247 } 00:28:09.247 }, 00:28:09.247 "base_bdevs_list": [ 00:28:09.247 { 00:28:09.247 "name": "spare", 00:28:09.247 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:09.247 "is_configured": true, 00:28:09.247 "data_offset": 2048, 00:28:09.247 "data_size": 63488 00:28:09.247 }, 00:28:09.247 { 00:28:09.247 "name": null, 00:28:09.247 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:09.247 "is_configured": false, 00:28:09.247 "data_offset": 2048, 00:28:09.247 "data_size": 63488 00:28:09.247 }, 00:28:09.247 { 00:28:09.247 "name": "BaseBdev3", 00:28:09.247 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:09.247 "is_configured": true, 00:28:09.247 "data_offset": 2048, 00:28:09.247 "data_size": 63488 00:28:09.247 }, 00:28:09.247 { 00:28:09.247 "name": "BaseBdev4", 00:28:09.247 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:09.247 "is_configured": true, 00:28:09.247 "data_offset": 2048, 00:28:09.247 "data_size": 63488 00:28:09.247 } 00:28:09.247 ] 00:28:09.247 }' 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:09.247 12:09:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:09.505 [2024-07-15 12:09:23.038309] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:09.765 [2024-07-15 12:09:23.120283] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:09.765 [2024-07-15 12:09:23.120328] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:09.765 [2024-07-15 12:09:23.120345] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:09.765 [2024-07-15 12:09:23.120353] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:09.765 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.023 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:10.023 "name": "raid_bdev1", 00:28:10.023 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 
00:28:10.023 "strip_size_kb": 0, 00:28:10.023 "state": "online", 00:28:10.023 "raid_level": "raid1", 00:28:10.023 "superblock": true, 00:28:10.023 "num_base_bdevs": 4, 00:28:10.023 "num_base_bdevs_discovered": 2, 00:28:10.023 "num_base_bdevs_operational": 2, 00:28:10.023 "base_bdevs_list": [ 00:28:10.023 { 00:28:10.023 "name": null, 00:28:10.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.023 "is_configured": false, 00:28:10.023 "data_offset": 2048, 00:28:10.023 "data_size": 63488 00:28:10.023 }, 00:28:10.023 { 00:28:10.023 "name": null, 00:28:10.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:10.023 "is_configured": false, 00:28:10.023 "data_offset": 2048, 00:28:10.023 "data_size": 63488 00:28:10.023 }, 00:28:10.023 { 00:28:10.023 "name": "BaseBdev3", 00:28:10.023 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:10.023 "is_configured": true, 00:28:10.023 "data_offset": 2048, 00:28:10.023 "data_size": 63488 00:28:10.023 }, 00:28:10.023 { 00:28:10.023 "name": "BaseBdev4", 00:28:10.023 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:10.023 "is_configured": true, 00:28:10.023 "data_offset": 2048, 00:28:10.023 "data_size": 63488 00:28:10.023 } 00:28:10.023 ] 00:28:10.023 }' 00:28:10.023 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:10.023 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:10.590 12:09:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:10.850 [2024-07-15 12:09:24.199508] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:10.850 [2024-07-15 12:09:24.199561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:10.850 [2024-07-15 12:09:24.199585] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x25e0be0 00:28:10.850 [2024-07-15 12:09:24.199598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:10.850 [2024-07-15 12:09:24.199993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:10.850 [2024-07-15 12:09:24.200013] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:10.850 [2024-07-15 12:09:24.200096] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:10.850 [2024-07-15 12:09:24.200108] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:28:10.850 [2024-07-15 12:09:24.200118] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:10.850 [2024-07-15 12:09:24.200137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:10.850 [2024-07-15 12:09:24.204591] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25fa8b0 00:28:10.850 spare 00:28:10.850 [2024-07-15 12:09:24.206091] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:10.850 12:09:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:11.784 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:12.042 "name": "raid_bdev1", 00:28:12.042 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:12.042 "strip_size_kb": 0, 00:28:12.042 "state": "online", 00:28:12.042 "raid_level": "raid1", 00:28:12.042 "superblock": true, 00:28:12.042 "num_base_bdevs": 4, 00:28:12.042 "num_base_bdevs_discovered": 3, 00:28:12.042 "num_base_bdevs_operational": 3, 00:28:12.042 "process": { 00:28:12.042 "type": "rebuild", 00:28:12.042 "target": "spare", 00:28:12.042 "progress": { 00:28:12.042 "blocks": 22528, 00:28:12.042 "percent": 35 00:28:12.042 } 00:28:12.042 }, 00:28:12.042 "base_bdevs_list": [ 00:28:12.042 { 00:28:12.042 "name": "spare", 00:28:12.042 "uuid": "e0ad79b3-a790-578f-8cf1-28dc9dfbd0e4", 00:28:12.042 "is_configured": true, 00:28:12.042 "data_offset": 2048, 00:28:12.042 "data_size": 63488 00:28:12.042 }, 00:28:12.042 { 00:28:12.042 "name": null, 00:28:12.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.042 "is_configured": false, 00:28:12.042 "data_offset": 2048, 00:28:12.042 "data_size": 63488 00:28:12.042 }, 00:28:12.042 { 00:28:12.042 "name": "BaseBdev3", 00:28:12.042 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:12.042 "is_configured": true, 00:28:12.042 "data_offset": 2048, 00:28:12.042 "data_size": 63488 00:28:12.042 }, 00:28:12.042 { 00:28:12.042 "name": "BaseBdev4", 00:28:12.042 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:12.042 "is_configured": true, 00:28:12.042 "data_offset": 2048, 00:28:12.042 "data_size": 63488 00:28:12.042 } 00:28:12.042 ] 00:28:12.042 }' 00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:12.042 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:12.300 [2024-07-15 12:09:25.663102] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.300 [2024-07-15 12:09:25.717745] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:12.300 [2024-07-15 12:09:25.717789] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:12.300 [2024-07-15 12:09:25.717805] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:12.300 [2024-07-15 12:09:25.717813] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:12.300 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:12.558 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:12.558 "name": "raid_bdev1", 00:28:12.558 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:12.558 "strip_size_kb": 0, 00:28:12.558 "state": "online", 00:28:12.558 "raid_level": "raid1", 00:28:12.558 "superblock": true, 00:28:12.558 "num_base_bdevs": 4, 00:28:12.558 "num_base_bdevs_discovered": 2, 00:28:12.558 "num_base_bdevs_operational": 2, 00:28:12.558 "base_bdevs_list": [ 00:28:12.559 { 00:28:12.559 "name": null, 00:28:12.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.559 "is_configured": false, 00:28:12.559 "data_offset": 2048, 00:28:12.559 "data_size": 63488 00:28:12.559 }, 00:28:12.559 { 00:28:12.559 "name": null, 00:28:12.559 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:12.559 "is_configured": false, 00:28:12.559 "data_offset": 2048, 00:28:12.559 "data_size": 63488 00:28:12.559 }, 00:28:12.559 { 00:28:12.559 "name": "BaseBdev3", 00:28:12.559 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:12.559 "is_configured": true, 00:28:12.559 "data_offset": 2048, 00:28:12.559 "data_size": 63488 00:28:12.559 }, 00:28:12.559 { 00:28:12.559 "name": "BaseBdev4", 00:28:12.559 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:12.559 "is_configured": true, 00:28:12.559 "data_offset": 2048, 
00:28:12.559 "data_size": 63488 00:28:12.559 } 00:28:12.559 ] 00:28:12.559 }' 00:28:12.559 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:12.559 12:09:25 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.125 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:13.385 "name": "raid_bdev1", 00:28:13.385 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:13.385 "strip_size_kb": 0, 00:28:13.385 "state": "online", 00:28:13.385 "raid_level": "raid1", 00:28:13.385 "superblock": true, 00:28:13.385 "num_base_bdevs": 4, 00:28:13.385 "num_base_bdevs_discovered": 2, 00:28:13.385 "num_base_bdevs_operational": 2, 00:28:13.385 "base_bdevs_list": [ 00:28:13.385 { 00:28:13.385 "name": null, 00:28:13.385 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:13.385 "is_configured": false, 00:28:13.385 "data_offset": 2048, 00:28:13.385 "data_size": 63488 00:28:13.385 }, 00:28:13.385 { 00:28:13.385 "name": null, 00:28:13.385 "uuid": "00000000-0000-0000-0000-000000000000", 
00:28:13.385 "is_configured": false, 00:28:13.385 "data_offset": 2048, 00:28:13.385 "data_size": 63488 00:28:13.385 }, 00:28:13.385 { 00:28:13.385 "name": "BaseBdev3", 00:28:13.385 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:13.385 "is_configured": true, 00:28:13.385 "data_offset": 2048, 00:28:13.385 "data_size": 63488 00:28:13.385 }, 00:28:13.385 { 00:28:13.385 "name": "BaseBdev4", 00:28:13.385 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:13.385 "is_configured": true, 00:28:13.385 "data_offset": 2048, 00:28:13.385 "data_size": 63488 00:28:13.385 } 00:28:13.385 ] 00:28:13.385 }' 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:13.385 12:09:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:28:13.644 12:09:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:13.903 [2024-07-15 12:09:27.294521] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:13.903 [2024-07-15 12:09:27.294568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:13.903 [2024-07-15 12:09:27.294589] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e3bf0 00:28:13.903 [2024-07-15 12:09:27.294602] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:13.903 
[2024-07-15 12:09:27.294964] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:13.903 [2024-07-15 12:09:27.294986] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:13.903 [2024-07-15 12:09:27.295052] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:28:13.903 [2024-07-15 12:09:27.295064] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:13.903 [2024-07-15 12:09:27.295075] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:13.903 BaseBdev1 00:28:13.903 12:09:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:14.840 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.100 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:15.100 "name": "raid_bdev1", 00:28:15.100 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:15.100 "strip_size_kb": 0, 00:28:15.100 "state": "online", 00:28:15.100 "raid_level": "raid1", 00:28:15.100 "superblock": true, 00:28:15.100 "num_base_bdevs": 4, 00:28:15.100 "num_base_bdevs_discovered": 2, 00:28:15.100 "num_base_bdevs_operational": 2, 00:28:15.100 "base_bdevs_list": [ 00:28:15.100 { 00:28:15.100 "name": null, 00:28:15.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.100 "is_configured": false, 00:28:15.100 "data_offset": 2048, 00:28:15.100 "data_size": 63488 00:28:15.100 }, 00:28:15.100 { 00:28:15.100 "name": null, 00:28:15.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.100 "is_configured": false, 00:28:15.100 "data_offset": 2048, 00:28:15.100 "data_size": 63488 00:28:15.100 }, 00:28:15.100 { 00:28:15.100 "name": "BaseBdev3", 00:28:15.100 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:15.100 "is_configured": true, 00:28:15.100 "data_offset": 2048, 00:28:15.100 "data_size": 63488 00:28:15.100 }, 00:28:15.100 { 00:28:15.100 "name": "BaseBdev4", 00:28:15.100 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:15.100 "is_configured": true, 00:28:15.100 "data_offset": 2048, 00:28:15.100 "data_size": 63488 00:28:15.100 } 00:28:15.100 ] 00:28:15.100 }' 00:28:15.100 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:15.100 12:09:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:15.668 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:15.929 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:15.929 "name": "raid_bdev1", 00:28:15.929 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:15.929 "strip_size_kb": 0, 00:28:15.929 "state": "online", 00:28:15.929 "raid_level": "raid1", 00:28:15.929 "superblock": true, 00:28:15.929 "num_base_bdevs": 4, 00:28:15.929 "num_base_bdevs_discovered": 2, 00:28:15.929 "num_base_bdevs_operational": 2, 00:28:15.929 "base_bdevs_list": [ 00:28:15.929 { 00:28:15.929 "name": null, 00:28:15.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.929 "is_configured": false, 00:28:15.929 "data_offset": 2048, 00:28:15.929 "data_size": 63488 00:28:15.929 }, 00:28:15.929 { 00:28:15.929 "name": null, 00:28:15.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:15.929 "is_configured": false, 00:28:15.929 "data_offset": 2048, 00:28:15.929 "data_size": 63488 00:28:15.929 }, 00:28:15.929 { 00:28:15.929 "name": "BaseBdev3", 00:28:15.929 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:15.929 "is_configured": true, 00:28:15.929 "data_offset": 2048, 00:28:15.929 "data_size": 63488 00:28:15.929 }, 00:28:15.929 { 
00:28:15.929 "name": "BaseBdev4", 00:28:15.929 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:15.929 "is_configured": true, 00:28:15.929 "data_offset": 2048, 00:28:15.929 "data_size": 63488 00:28:15.929 } 00:28:15.929 ] 00:28:15.929 }' 00:28:15.929 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:15.929 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:15.929 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@648 -- # local es=0 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:16.189 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:16.189 [2024-07-15 12:09:29.773425] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:16.189 [2024-07-15 12:09:29.773553] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:28:16.189 [2024-07-15 12:09:29.773569] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:16.189 request: 00:28:16.189 { 00:28:16.189 "base_bdev": "BaseBdev1", 00:28:16.190 "raid_bdev": "raid_bdev1", 00:28:16.190 "method": "bdev_raid_add_base_bdev", 00:28:16.190 "req_id": 1 00:28:16.190 } 00:28:16.190 Got JSON-RPC error response 00:28:16.190 response: 00:28:16.190 { 00:28:16.190 "code": -22, 00:28:16.190 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:16.190 } 00:28:16.450 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # es=1 00:28:16.450 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:16.450 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:16.450 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:16.450 12:09:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:17.390 12:09:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:17.649 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:17.649 "name": "raid_bdev1", 00:28:17.649 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:17.649 "strip_size_kb": 0, 00:28:17.649 "state": "online", 00:28:17.649 "raid_level": "raid1", 00:28:17.649 "superblock": true, 00:28:17.649 "num_base_bdevs": 4, 00:28:17.649 
"num_base_bdevs_discovered": 2, 00:28:17.649 "num_base_bdevs_operational": 2, 00:28:17.649 "base_bdevs_list": [ 00:28:17.649 { 00:28:17.649 "name": null, 00:28:17.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.649 "is_configured": false, 00:28:17.649 "data_offset": 2048, 00:28:17.649 "data_size": 63488 00:28:17.649 }, 00:28:17.649 { 00:28:17.649 "name": null, 00:28:17.649 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:17.649 "is_configured": false, 00:28:17.649 "data_offset": 2048, 00:28:17.649 "data_size": 63488 00:28:17.649 }, 00:28:17.649 { 00:28:17.649 "name": "BaseBdev3", 00:28:17.649 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:17.649 "is_configured": true, 00:28:17.649 "data_offset": 2048, 00:28:17.649 "data_size": 63488 00:28:17.649 }, 00:28:17.649 { 00:28:17.649 "name": "BaseBdev4", 00:28:17.649 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:17.649 "is_configured": true, 00:28:17.649 "data_offset": 2048, 00:28:17.649 "data_size": 63488 00:28:17.649 } 00:28:17.649 ] 00:28:17.649 }' 00:28:17.649 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:17.649 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:18.216 12:09:31 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:18.475 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:18.475 "name": "raid_bdev1", 00:28:18.475 "uuid": "75ddf800-c31d-4750-aa54-858143a3ef63", 00:28:18.475 "strip_size_kb": 0, 00:28:18.475 "state": "online", 00:28:18.475 "raid_level": "raid1", 00:28:18.475 "superblock": true, 00:28:18.475 "num_base_bdevs": 4, 00:28:18.475 "num_base_bdevs_discovered": 2, 00:28:18.475 "num_base_bdevs_operational": 2, 00:28:18.475 "base_bdevs_list": [ 00:28:18.475 { 00:28:18.475 "name": null, 00:28:18.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.475 "is_configured": false, 00:28:18.475 "data_offset": 2048, 00:28:18.475 "data_size": 63488 00:28:18.475 }, 00:28:18.475 { 00:28:18.475 "name": null, 00:28:18.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:18.475 "is_configured": false, 00:28:18.475 "data_offset": 2048, 00:28:18.475 "data_size": 63488 00:28:18.475 }, 00:28:18.475 { 00:28:18.475 "name": "BaseBdev3", 00:28:18.475 "uuid": "cdbf05eb-1e4f-5ee3-9549-0f5b8a3698b0", 00:28:18.475 "is_configured": true, 00:28:18.475 "data_offset": 2048, 00:28:18.475 "data_size": 63488 00:28:18.475 }, 00:28:18.475 { 00:28:18.475 "name": "BaseBdev4", 00:28:18.475 "uuid": "72c2a6db-594a-529b-95c2-9b7402df3264", 00:28:18.475 "is_configured": true, 00:28:18.475 "data_offset": 2048, 00:28:18.475 "data_size": 63488 00:28:18.475 } 00:28:18.475 ] 00:28:18.475 }' 00:28:18.475 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:18.475 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:18.475 12:09:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # 
[[ none == \n\o\n\e ]] 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 1591753 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@948 -- # '[' -z 1591753 ']' 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@952 -- # kill -0 1591753 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # uname 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:18.475 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1591753 00:28:18.735 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:18.735 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:18.735 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1591753' 00:28:18.735 killing process with pid 1591753 00:28:18.735 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # kill 1591753 00:28:18.735 Received shutdown signal, test time was about 27.900135 seconds 00:28:18.735 00:28:18.735 Latency(us) 00:28:18.735 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:18.735 =================================================================================================================== 00:28:18.735 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:18.735 [2024-07-15 12:09:32.090006] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:18.735 [2024-07-15 12:09:32.090113] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:18.735 [2024-07-15 12:09:32.090173] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:18.735 
12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@972 -- # wait 1591753 00:28:18.735 [2024-07-15 12:09:32.090186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25e0770 name raid_bdev1, state offline 00:28:18.735 [2024-07-15 12:09:32.131018] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:18.995 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:28:18.995 00:28:18.995 real 0m33.878s 00:28:18.995 user 0m53.721s 00:28:18.995 sys 0m5.285s 00:28:18.995 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:18.995 12:09:32 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:28:18.995 ************************************ 00:28:18.995 END TEST raid_rebuild_test_sb_io 00:28:18.995 ************************************ 00:28:18.995 12:09:32 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:18.995 12:09:32 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:28:18.995 12:09:32 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:28:18.995 12:09:32 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:28:18.995 12:09:32 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:28:18.995 12:09:32 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:18.995 12:09:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:18.995 ************************************ 00:28:18.995 START TEST raid_state_function_test_sb_4k 00:28:18.995 ************************************ 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 
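The `(( i = 1 ))` / `(( i <= num_base_bdevs ))` / `echo BaseBdev1` trace above is the harness building its base-bdev name list before creating the raid bdev. A minimal standalone sketch of that loop (variable names mirror the trace; nothing here talks to SPDK):

```shell
#!/usr/bin/env bash
# Sketch of the base-bdev list construction traced above: the harness
# loops i from 1 to num_base_bdevs and collects "BaseBdev$i" names.
num_base_bdevs=2
base_bdevs=()
for (( i = 1; i <= num_base_bdevs; i++ )); do
    base_bdevs+=("BaseBdev$i")
done
# The joined list is what later gets passed to bdev_raid_create via -b.
echo "${base_bdevs[*]}"
# prints: BaseBdev1 BaseBdev2
```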
00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1597030 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1597030' 00:28:18.995 Process raid pid: 1597030 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1597030 /var/tmp/spdk-raid.sock 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1597030 ']' 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:18.995 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:18.995 12:09:32 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:18.995 [2024-07-15 12:09:32.502387] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
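At `bdev_raid.sh@237`-`238` above, `'[' true = true ']'` selects `-s` as the superblock argument for the later `bdev_raid_create` call. A minimal sketch of that branch (the echoed RPC line is illustrative only and is not executed against a socket):

```shell
#!/usr/bin/env bash
# Sketch of the superblock flag selection traced at bdev_raid.sh@237-238:
# when the test variant runs with superblock enabled, "-s" is chosen as
# the extra bdev_raid_create argument; otherwise the flag is omitted.
superblock=true
superblock_create_arg=
if [ "$superblock" = true ]; then
    superblock_create_arg=-s
fi
# Illustration of where the flag lands in the RPC call (not executed here).
echo "bdev_raid_create $superblock_create_arg -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid"
```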
00:28:18.995 [2024-07-15 12:09:32.502458] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:28:19.254 [2024-07-15 12:09:32.633211] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:19.254 [2024-07-15 12:09:32.738962] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:19.254 [2024-07-15 12:09:32.803516] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:19.254 [2024-07-15 12:09:32.803551] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:20.241 [2024-07-15 12:09:33.667055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:20.241 [2024-07-15 12:09:33.667101] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:20.241 [2024-07-15 12:09:33.667112] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:20.241 [2024-07-15 12:09:33.667124] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
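The `verify_raid_bdev_state` calls that follow fetch `bdev_raid_get_bdevs all` output and filter it with jq. A sketch of that selection against a trimmed stand-in document (requires jq; the sample JSON is hand-written for illustration, not captured RPC output):

```shell
#!/usr/bin/env bash
# Sketch of the jq selection used by verify_raid_bdev_state: the RPC
# returns a JSON array, the harness picks out the named raid bdev, then
# compares fields against expected values with a literal string test.
# The JSON below is a hand-written stand-in, not real RPC output.
tmp='[{"name":"Existed_Raid","state":"configuring","raid_level":"raid1","num_base_bdevs":2}]'
state=$(echo "$tmp" | jq -r '.[] | select(.name == "Existed_Raid") | .state')
# Backslash-escaping each character forces a literal match rather than a
# glob match -- the same idiom as [[ none == \n\o\n\e ]] in the trace.
[[ $state == \c\o\n\f\i\g\u\r\i\n\g ]] && echo "state verified"
```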
00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:20.241 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:20.535 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:20.535 "name": "Existed_Raid", 00:28:20.535 "uuid": "b516c7dc-405f-40ba-b505-0f7b541d0c2e", 00:28:20.535 "strip_size_kb": 0, 00:28:20.535 "state": "configuring", 00:28:20.535 "raid_level": "raid1", 00:28:20.535 "superblock": true, 00:28:20.535 "num_base_bdevs": 2, 00:28:20.535 "num_base_bdevs_discovered": 0, 00:28:20.535 "num_base_bdevs_operational": 2, 00:28:20.535 "base_bdevs_list": [ 00:28:20.535 { 00:28:20.535 "name": "BaseBdev1", 00:28:20.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.535 "is_configured": false, 00:28:20.535 "data_offset": 0, 00:28:20.535 "data_size": 0 
00:28:20.535 }, 00:28:20.535 { 00:28:20.535 "name": "BaseBdev2", 00:28:20.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:20.535 "is_configured": false, 00:28:20.536 "data_offset": 0, 00:28:20.536 "data_size": 0 00:28:20.536 } 00:28:20.536 ] 00:28:20.536 }' 00:28:20.536 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:20.536 12:09:33 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:21.102 12:09:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:21.361 [2024-07-15 12:09:34.797886] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:21.361 [2024-07-15 12:09:34.797917] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22aab00 name Existed_Raid, state configuring 00:28:21.361 12:09:34 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:21.620 [2024-07-15 12:09:35.054582] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:28:21.620 [2024-07-15 12:09:35.054614] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:28:21.620 [2024-07-15 12:09:35.054624] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:21.620 [2024-07-15 12:09:35.054636] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:21.620 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:28:21.878 [2024-07-15 
12:09:35.323008] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:21.878 BaseBdev1 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:21.878 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:22.136 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:28:22.395 [ 00:28:22.395 { 00:28:22.395 "name": "BaseBdev1", 00:28:22.395 "aliases": [ 00:28:22.395 "55bb818f-ed41-4b0b-a025-482d504b8d33" 00:28:22.395 ], 00:28:22.395 "product_name": "Malloc disk", 00:28:22.395 "block_size": 4096, 00:28:22.395 "num_blocks": 8192, 00:28:22.395 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:22.395 "assigned_rate_limits": { 00:28:22.395 "rw_ios_per_sec": 0, 00:28:22.395 "rw_mbytes_per_sec": 0, 00:28:22.395 "r_mbytes_per_sec": 0, 00:28:22.395 "w_mbytes_per_sec": 0 00:28:22.395 }, 00:28:22.395 "claimed": true, 00:28:22.395 "claim_type": "exclusive_write", 00:28:22.395 "zoned": false, 00:28:22.395 "supported_io_types": { 00:28:22.395 "read": true, 00:28:22.395 "write": true, 
00:28:22.395 "unmap": true, 00:28:22.395 "flush": true, 00:28:22.395 "reset": true, 00:28:22.395 "nvme_admin": false, 00:28:22.395 "nvme_io": false, 00:28:22.395 "nvme_io_md": false, 00:28:22.395 "write_zeroes": true, 00:28:22.395 "zcopy": true, 00:28:22.395 "get_zone_info": false, 00:28:22.395 "zone_management": false, 00:28:22.395 "zone_append": false, 00:28:22.395 "compare": false, 00:28:22.395 "compare_and_write": false, 00:28:22.395 "abort": true, 00:28:22.395 "seek_hole": false, 00:28:22.395 "seek_data": false, 00:28:22.395 "copy": true, 00:28:22.395 "nvme_iov_md": false 00:28:22.395 }, 00:28:22.395 "memory_domains": [ 00:28:22.395 { 00:28:22.395 "dma_device_id": "system", 00:28:22.395 "dma_device_type": 1 00:28:22.395 }, 00:28:22.395 { 00:28:22.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:22.395 "dma_device_type": 2 00:28:22.395 } 00:28:22.395 ], 00:28:22.395 "driver_specific": {} 00:28:22.395 } 00:28:22.395 ] 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # return 0 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:22.395 12:09:35 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:22.654 12:09:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:22.654 "name": "Existed_Raid", 00:28:22.654 "uuid": "769c723a-a799-426c-92ca-032e7ea70e55", 00:28:22.654 "strip_size_kb": 0, 00:28:22.654 "state": "configuring", 00:28:22.654 "raid_level": "raid1", 00:28:22.654 "superblock": true, 00:28:22.654 "num_base_bdevs": 2, 00:28:22.654 "num_base_bdevs_discovered": 1, 00:28:22.654 "num_base_bdevs_operational": 2, 00:28:22.654 "base_bdevs_list": [ 00:28:22.654 { 00:28:22.654 "name": "BaseBdev1", 00:28:22.654 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:22.654 "is_configured": true, 00:28:22.654 "data_offset": 256, 00:28:22.654 "data_size": 7936 00:28:22.654 }, 00:28:22.654 { 00:28:22.654 "name": "BaseBdev2", 00:28:22.654 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:22.654 "is_configured": false, 00:28:22.654 "data_offset": 0, 00:28:22.654 "data_size": 0 00:28:22.654 } 00:28:22.654 ] 00:28:22.654 }' 00:28:22.654 12:09:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:22.654 12:09:36 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:23.221 12:09:36 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:28:23.480 [2024-07-15 12:09:37.003469] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:28:23.480 [2024-07-15 12:09:37.003513] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22aa3d0 name Existed_Raid, state configuring 00:28:23.480 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:28:23.738 [2024-07-15 12:09:37.264182] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:23.738 [2024-07-15 12:09:37.265730] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:28:23.738 [2024-07-15 12:09:37.265762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:23.738 12:09:37 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:23.738 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:23.997 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:23.997 "name": "Existed_Raid", 00:28:23.997 "uuid": "8c3f6865-d284-4675-bcdc-4408f6590235", 00:28:23.997 "strip_size_kb": 0, 00:28:23.997 "state": "configuring", 00:28:23.997 "raid_level": "raid1", 00:28:23.997 "superblock": true, 00:28:23.997 "num_base_bdevs": 2, 00:28:23.997 "num_base_bdevs_discovered": 1, 00:28:23.997 "num_base_bdevs_operational": 2, 00:28:23.997 "base_bdevs_list": [ 00:28:23.997 { 00:28:23.997 "name": "BaseBdev1", 00:28:23.997 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:23.997 "is_configured": true, 00:28:23.997 "data_offset": 256, 00:28:23.997 "data_size": 7936 00:28:23.997 }, 00:28:23.997 { 00:28:23.997 "name": "BaseBdev2", 00:28:23.997 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:23.997 "is_configured": false, 00:28:23.997 "data_offset": 0, 00:28:23.997 "data_size": 0 00:28:23.997 } 00:28:23.997 ] 00:28:23.997 }' 00:28:23.997 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:23.997 12:09:37 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:24.934 
12:09:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:28:24.934 [2024-07-15 12:09:38.420484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:24.934 [2024-07-15 12:09:38.420650] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x22ab150 00:28:24.934 [2024-07-15 12:09:38.420664] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:24.934 [2024-07-15 12:09:38.420848] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21c5420 00:28:24.934 [2024-07-15 12:09:38.420977] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22ab150 00:28:24.934 [2024-07-15 12:09:38.420988] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22ab150 00:28:24.934 [2024-07-15 12:09:38.421084] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:24.934 BaseBdev2 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local i 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:28:24.934 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:28:25.193 12:09:38 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:28:25.452 [ 00:28:25.452 { 00:28:25.452 "name": "BaseBdev2", 00:28:25.452 "aliases": [ 00:28:25.452 "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a" 00:28:25.452 ], 00:28:25.452 "product_name": "Malloc disk", 00:28:25.452 "block_size": 4096, 00:28:25.452 "num_blocks": 8192, 00:28:25.452 "uuid": "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a", 00:28:25.452 "assigned_rate_limits": { 00:28:25.452 "rw_ios_per_sec": 0, 00:28:25.452 "rw_mbytes_per_sec": 0, 00:28:25.452 "r_mbytes_per_sec": 0, 00:28:25.452 "w_mbytes_per_sec": 0 00:28:25.452 }, 00:28:25.452 "claimed": true, 00:28:25.452 "claim_type": "exclusive_write", 00:28:25.452 "zoned": false, 00:28:25.452 "supported_io_types": { 00:28:25.452 "read": true, 00:28:25.452 "write": true, 00:28:25.452 "unmap": true, 00:28:25.452 "flush": true, 00:28:25.452 "reset": true, 00:28:25.452 "nvme_admin": false, 00:28:25.452 "nvme_io": false, 00:28:25.452 "nvme_io_md": false, 00:28:25.452 "write_zeroes": true, 00:28:25.452 "zcopy": true, 00:28:25.452 "get_zone_info": false, 00:28:25.452 "zone_management": false, 00:28:25.452 "zone_append": false, 00:28:25.452 "compare": false, 00:28:25.452 "compare_and_write": false, 00:28:25.452 "abort": true, 00:28:25.452 "seek_hole": false, 00:28:25.452 "seek_data": false, 00:28:25.452 "copy": true, 00:28:25.452 "nvme_iov_md": false 00:28:25.452 }, 00:28:25.452 "memory_domains": [ 00:28:25.452 { 00:28:25.452 "dma_device_id": "system", 00:28:25.452 "dma_device_type": 1 00:28:25.452 }, 00:28:25.452 { 00:28:25.452 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:25.452 "dma_device_type": 2 00:28:25.452 } 00:28:25.452 ], 00:28:25.452 "driver_specific": {} 00:28:25.452 } 00:28:25.452 ] 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@905 -- # return 0 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:25.710 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:25.711 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:25.711 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:25.711 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:25.711 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:25.711 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:26.277 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:26.277 "name": "Existed_Raid", 00:28:26.277 "uuid": 
"8c3f6865-d284-4675-bcdc-4408f6590235", 00:28:26.277 "strip_size_kb": 0, 00:28:26.277 "state": "online", 00:28:26.277 "raid_level": "raid1", 00:28:26.277 "superblock": true, 00:28:26.277 "num_base_bdevs": 2, 00:28:26.277 "num_base_bdevs_discovered": 2, 00:28:26.277 "num_base_bdevs_operational": 2, 00:28:26.277 "base_bdevs_list": [ 00:28:26.277 { 00:28:26.277 "name": "BaseBdev1", 00:28:26.277 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:26.277 "is_configured": true, 00:28:26.277 "data_offset": 256, 00:28:26.277 "data_size": 7936 00:28:26.277 }, 00:28:26.277 { 00:28:26.277 "name": "BaseBdev2", 00:28:26.277 "uuid": "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a", 00:28:26.277 "is_configured": true, 00:28:26.277 "data_offset": 256, 00:28:26.277 "data_size": 7936 00:28:26.277 } 00:28:26.277 ] 00:28:26.277 }' 00:28:26.277 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:26.277 12:09:39 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:26.844 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:28:26.844 12:09:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:27.103 [2024-07-15 12:09:40.518352] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:27.103 "name": "Existed_Raid", 00:28:27.103 "aliases": [ 00:28:27.103 "8c3f6865-d284-4675-bcdc-4408f6590235" 00:28:27.103 ], 00:28:27.103 "product_name": "Raid Volume", 00:28:27.103 "block_size": 4096, 00:28:27.103 "num_blocks": 7936, 00:28:27.103 "uuid": "8c3f6865-d284-4675-bcdc-4408f6590235", 00:28:27.103 "assigned_rate_limits": { 00:28:27.103 "rw_ios_per_sec": 0, 00:28:27.103 "rw_mbytes_per_sec": 0, 00:28:27.103 "r_mbytes_per_sec": 0, 00:28:27.103 "w_mbytes_per_sec": 0 00:28:27.103 }, 00:28:27.103 "claimed": false, 00:28:27.103 "zoned": false, 00:28:27.103 "supported_io_types": { 00:28:27.103 "read": true, 00:28:27.103 "write": true, 00:28:27.103 "unmap": false, 00:28:27.103 "flush": false, 00:28:27.103 "reset": true, 00:28:27.103 "nvme_admin": false, 00:28:27.103 "nvme_io": false, 00:28:27.103 "nvme_io_md": false, 00:28:27.103 "write_zeroes": true, 00:28:27.103 "zcopy": false, 00:28:27.103 "get_zone_info": false, 00:28:27.103 "zone_management": false, 00:28:27.103 "zone_append": false, 00:28:27.103 "compare": false, 00:28:27.103 "compare_and_write": false, 00:28:27.103 "abort": false, 00:28:27.103 "seek_hole": false, 00:28:27.103 "seek_data": false, 00:28:27.103 "copy": false, 00:28:27.103 "nvme_iov_md": false 00:28:27.103 }, 00:28:27.103 "memory_domains": [ 00:28:27.103 { 00:28:27.103 "dma_device_id": "system", 00:28:27.103 "dma_device_type": 1 00:28:27.103 }, 00:28:27.103 { 00:28:27.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.103 "dma_device_type": 2 00:28:27.103 }, 00:28:27.103 { 00:28:27.103 "dma_device_id": "system", 00:28:27.103 "dma_device_type": 1 00:28:27.103 }, 00:28:27.103 { 00:28:27.103 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:28:27.103 "dma_device_type": 2 00:28:27.103 } 00:28:27.103 ], 00:28:27.103 "driver_specific": { 00:28:27.103 "raid": { 00:28:27.103 "uuid": "8c3f6865-d284-4675-bcdc-4408f6590235", 00:28:27.103 "strip_size_kb": 0, 00:28:27.103 "state": "online", 00:28:27.103 "raid_level": "raid1", 00:28:27.103 "superblock": true, 00:28:27.103 "num_base_bdevs": 2, 00:28:27.103 "num_base_bdevs_discovered": 2, 00:28:27.103 "num_base_bdevs_operational": 2, 00:28:27.103 "base_bdevs_list": [ 00:28:27.103 { 00:28:27.103 "name": "BaseBdev1", 00:28:27.103 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:27.103 "is_configured": true, 00:28:27.103 "data_offset": 256, 00:28:27.103 "data_size": 7936 00:28:27.103 }, 00:28:27.103 { 00:28:27.103 "name": "BaseBdev2", 00:28:27.103 "uuid": "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a", 00:28:27.103 "is_configured": true, 00:28:27.103 "data_offset": 256, 00:28:27.103 "data_size": 7936 00:28:27.103 } 00:28:27.103 ] 00:28:27.103 } 00:28:27.103 } 00:28:27.103 }' 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:28:27.103 BaseBdev2' 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:28:27.103 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:27.362 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:27.362 "name": "BaseBdev1", 00:28:27.362 "aliases": [ 00:28:27.362 "55bb818f-ed41-4b0b-a025-482d504b8d33" 
00:28:27.362 ], 00:28:27.362 "product_name": "Malloc disk", 00:28:27.362 "block_size": 4096, 00:28:27.362 "num_blocks": 8192, 00:28:27.362 "uuid": "55bb818f-ed41-4b0b-a025-482d504b8d33", 00:28:27.362 "assigned_rate_limits": { 00:28:27.362 "rw_ios_per_sec": 0, 00:28:27.362 "rw_mbytes_per_sec": 0, 00:28:27.362 "r_mbytes_per_sec": 0, 00:28:27.362 "w_mbytes_per_sec": 0 00:28:27.362 }, 00:28:27.362 "claimed": true, 00:28:27.362 "claim_type": "exclusive_write", 00:28:27.362 "zoned": false, 00:28:27.362 "supported_io_types": { 00:28:27.362 "read": true, 00:28:27.362 "write": true, 00:28:27.362 "unmap": true, 00:28:27.362 "flush": true, 00:28:27.362 "reset": true, 00:28:27.362 "nvme_admin": false, 00:28:27.362 "nvme_io": false, 00:28:27.362 "nvme_io_md": false, 00:28:27.362 "write_zeroes": true, 00:28:27.362 "zcopy": true, 00:28:27.362 "get_zone_info": false, 00:28:27.362 "zone_management": false, 00:28:27.362 "zone_append": false, 00:28:27.362 "compare": false, 00:28:27.362 "compare_and_write": false, 00:28:27.362 "abort": true, 00:28:27.362 "seek_hole": false, 00:28:27.362 "seek_data": false, 00:28:27.362 "copy": true, 00:28:27.362 "nvme_iov_md": false 00:28:27.362 }, 00:28:27.362 "memory_domains": [ 00:28:27.362 { 00:28:27.362 "dma_device_id": "system", 00:28:27.362 "dma_device_type": 1 00:28:27.362 }, 00:28:27.362 { 00:28:27.362 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.362 "dma_device_type": 2 00:28:27.362 } 00:28:27.362 ], 00:28:27.362 "driver_specific": {} 00:28:27.362 }' 00:28:27.362 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:27.362 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:27.362 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:27.362 12:09:40 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:27.621 12:09:40 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:28:27.621 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:27.879 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:27.879 "name": "BaseBdev2", 00:28:27.879 "aliases": [ 00:28:27.879 "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a" 00:28:27.879 ], 00:28:27.879 "product_name": "Malloc disk", 00:28:27.879 "block_size": 4096, 00:28:27.879 "num_blocks": 8192, 00:28:27.879 "uuid": "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a", 00:28:27.879 "assigned_rate_limits": { 00:28:27.879 "rw_ios_per_sec": 0, 00:28:27.879 "rw_mbytes_per_sec": 0, 00:28:27.879 "r_mbytes_per_sec": 0, 00:28:27.879 "w_mbytes_per_sec": 0 00:28:27.879 }, 00:28:27.879 "claimed": true, 00:28:27.879 "claim_type": "exclusive_write", 00:28:27.879 "zoned": false, 
00:28:27.879 "supported_io_types": { 00:28:27.879 "read": true, 00:28:27.879 "write": true, 00:28:27.879 "unmap": true, 00:28:27.879 "flush": true, 00:28:27.879 "reset": true, 00:28:27.879 "nvme_admin": false, 00:28:27.879 "nvme_io": false, 00:28:27.879 "nvme_io_md": false, 00:28:27.879 "write_zeroes": true, 00:28:27.879 "zcopy": true, 00:28:27.879 "get_zone_info": false, 00:28:27.879 "zone_management": false, 00:28:27.879 "zone_append": false, 00:28:27.879 "compare": false, 00:28:27.879 "compare_and_write": false, 00:28:27.879 "abort": true, 00:28:27.879 "seek_hole": false, 00:28:27.879 "seek_data": false, 00:28:27.879 "copy": true, 00:28:27.879 "nvme_iov_md": false 00:28:27.879 }, 00:28:27.879 "memory_domains": [ 00:28:27.879 { 00:28:27.879 "dma_device_id": "system", 00:28:27.879 "dma_device_type": 1 00:28:27.879 }, 00:28:27.879 { 00:28:27.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:27.879 "dma_device_type": 2 00:28:27.879 } 00:28:27.879 ], 00:28:27.879 "driver_specific": {} 00:28:27.879 }' 00:28:27.879 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.138 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:28.398 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:28:28.398 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.398 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:28.398 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:28.398 12:09:41 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:28:28.658 [2024-07-15 12:09:42.054218] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:28.658 12:09:42 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:28.658 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:28:28.917 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:28.917 "name": "Existed_Raid", 00:28:28.917 "uuid": "8c3f6865-d284-4675-bcdc-4408f6590235", 00:28:28.917 "strip_size_kb": 0, 00:28:28.917 "state": "online", 00:28:28.917 "raid_level": "raid1", 00:28:28.917 "superblock": true, 00:28:28.917 "num_base_bdevs": 2, 00:28:28.917 "num_base_bdevs_discovered": 1, 00:28:28.917 "num_base_bdevs_operational": 1, 00:28:28.917 "base_bdevs_list": [ 00:28:28.917 { 00:28:28.917 "name": null, 00:28:28.917 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:28.917 "is_configured": false, 00:28:28.917 "data_offset": 256, 00:28:28.917 "data_size": 7936 00:28:28.917 }, 00:28:28.917 { 00:28:28.917 "name": "BaseBdev2", 00:28:28.917 "uuid": "2a6f88bc-bf09-43ea-97e6-e26ba2cf954a", 00:28:28.917 "is_configured": true, 00:28:28.917 "data_offset": 256, 00:28:28.917 "data_size": 7936 00:28:28.917 } 00:28:28.917 ] 00:28:28.917 }' 00:28:28.917 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:28.917 12:09:42 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:29.852 12:09:43 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:28:29.852 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:29.852 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:28:29.852 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.110 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:28:30.110 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:28:30.110 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:28:30.369 [2024-07-15 12:09:43.714928] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:28:30.369 [2024-07-15 12:09:43.715026] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:30.369 [2024-07-15 12:09:43.741103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:30.369 [2024-07-15 12:09:43.741146] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:30.369 [2024-07-15 12:09:43.741158] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22ab150 name Existed_Raid, state offline 00:28:30.369 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:28:30.369 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:28:30.369 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 
00:28:30.370 12:09:43 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1597030 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1597030 ']' 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1597030 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1597030 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1597030' 00:28:30.629 killing process with pid 1597030 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1597030 00:28:30.629 [2024-07-15 12:09:44.057172] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:30.629 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@972 
-- # wait 1597030 00:28:30.629 [2024-07-15 12:09:44.058970] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:30.888 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:28:30.888 00:28:30.888 real 0m12.031s 00:28:30.888 user 0m21.249s 00:28:30.888 sys 0m2.225s 00:28:30.888 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:30.888 12:09:44 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:30.888 ************************************ 00:28:30.888 END TEST raid_state_function_test_sb_4k 00:28:30.888 ************************************ 00:28:31.146 12:09:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:31.146 12:09:44 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:28:31.147 12:09:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:28:31.147 12:09:44 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:31.147 12:09:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:31.147 ************************************ 00:28:31.147 START TEST raid_superblock_test_4k 00:28:31.147 ************************************ 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:28:31.147 12:09:44 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=1598756 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 1598756 /var/tmp/spdk-raid.sock 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@829 -- # '[' -z 1598756 ']' 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen 
on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:31.147 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:31.147 12:09:44 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:31.147 [2024-07-15 12:09:44.611206] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:28:31.147 [2024-07-15 12:09:44.611275] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1598756 ] 00:28:31.147 [2024-07-15 12:09:44.740510] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:31.406 [2024-07-15 12:09:44.842482] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:31.406 [2024-07-15 12:09:44.899065] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:31.406 [2024-07-15 12:09:44.899092] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@862 -- # return 0 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 
00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:32.343 12:09:45 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:28:32.602 malloc1 00:28:32.602 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:33.171 [2024-07-15 12:09:46.554957] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:33.171 [2024-07-15 12:09:46.555008] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:33.171 [2024-07-15 12:09:46.555029] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xebd560 00:28:33.171 [2024-07-15 12:09:46.555042] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:33.171 [2024-07-15 12:09:46.556826] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:33.171 [2024-07-15 12:09:46.556856] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:33.171 pt1 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local 
bdev_pt=pt2 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:28:33.171 12:09:46 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:28:33.739 malloc2 00:28:33.739 12:09:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:33.998 [2024-07-15 12:09:47.590322] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:33.998 [2024-07-15 12:09:47.590370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:33.998 [2024-07-15 12:09:47.590388] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5b5b0 00:28:33.998 [2024-07-15 12:09:47.590400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:33.998 [2024-07-15 12:09:47.591926] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:33.998 [2024-07-15 12:09:47.591955] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:34.257 pt2 00:28:34.257 12:09:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:28:34.257 12:09:47 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:28:34.258 12:09:47 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:28:34.516 [2024-07-15 12:09:48.103676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:34.516 [2024-07-15 12:09:48.105029] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:34.516 [2024-07-15 12:09:48.105173] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf5cdb0 00:28:34.516 [2024-07-15 12:09:48.105186] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:34.516 [2024-07-15 12:09:48.105384] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf5dce0 00:28:34.516 [2024-07-15 12:09:48.105524] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf5cdb0 00:28:34.516 [2024-07-15 12:09:48.105534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf5cdb0 00:28:34.516 [2024-07-15 12:09:48.105631] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:34.774 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:34.775 12:09:48 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:34.775 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:35.033 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:35.033 "name": "raid_bdev1", 00:28:35.033 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:35.033 "strip_size_kb": 0, 00:28:35.033 "state": "online", 00:28:35.033 "raid_level": "raid1", 00:28:35.033 "superblock": true, 00:28:35.033 "num_base_bdevs": 2, 00:28:35.033 "num_base_bdevs_discovered": 2, 00:28:35.033 "num_base_bdevs_operational": 2, 00:28:35.033 "base_bdevs_list": [ 00:28:35.033 { 00:28:35.033 "name": "pt1", 00:28:35.033 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:35.033 "is_configured": true, 00:28:35.033 "data_offset": 256, 00:28:35.033 "data_size": 7936 00:28:35.033 }, 00:28:35.033 { 00:28:35.033 "name": "pt2", 00:28:35.033 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:35.033 "is_configured": true, 00:28:35.033 "data_offset": 256, 00:28:35.033 "data_size": 7936 00:28:35.033 } 00:28:35.033 ] 00:28:35.033 }' 00:28:35.033 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:35.033 12:09:48 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 
-- # local raid_bdev_name=raid_bdev1 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:35.601 12:09:48 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:35.601 [2024-07-15 12:09:49.150641] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:35.601 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:35.601 "name": "raid_bdev1", 00:28:35.601 "aliases": [ 00:28:35.601 "134c5e93-a1a8-4437-b190-c66c48d8c3b7" 00:28:35.601 ], 00:28:35.601 "product_name": "Raid Volume", 00:28:35.601 "block_size": 4096, 00:28:35.601 "num_blocks": 7936, 00:28:35.601 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:35.601 "assigned_rate_limits": { 00:28:35.601 "rw_ios_per_sec": 0, 00:28:35.601 "rw_mbytes_per_sec": 0, 00:28:35.601 "r_mbytes_per_sec": 0, 00:28:35.601 "w_mbytes_per_sec": 0 00:28:35.601 }, 00:28:35.601 "claimed": false, 00:28:35.601 "zoned": false, 00:28:35.601 "supported_io_types": { 00:28:35.601 "read": true, 00:28:35.601 "write": true, 00:28:35.601 "unmap": false, 00:28:35.601 "flush": false, 00:28:35.601 "reset": true, 00:28:35.601 "nvme_admin": false, 00:28:35.601 "nvme_io": false, 00:28:35.601 "nvme_io_md": false, 00:28:35.601 "write_zeroes": true, 00:28:35.601 "zcopy": false, 00:28:35.601 "get_zone_info": false, 00:28:35.601 "zone_management": false, 00:28:35.601 "zone_append": false, 
00:28:35.601 "compare": false, 00:28:35.601 "compare_and_write": false, 00:28:35.601 "abort": false, 00:28:35.601 "seek_hole": false, 00:28:35.601 "seek_data": false, 00:28:35.601 "copy": false, 00:28:35.601 "nvme_iov_md": false 00:28:35.601 }, 00:28:35.601 "memory_domains": [ 00:28:35.601 { 00:28:35.601 "dma_device_id": "system", 00:28:35.601 "dma_device_type": 1 00:28:35.601 }, 00:28:35.601 { 00:28:35.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:35.601 "dma_device_type": 2 00:28:35.601 }, 00:28:35.601 { 00:28:35.601 "dma_device_id": "system", 00:28:35.601 "dma_device_type": 1 00:28:35.601 }, 00:28:35.601 { 00:28:35.601 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:35.601 "dma_device_type": 2 00:28:35.601 } 00:28:35.601 ], 00:28:35.601 "driver_specific": { 00:28:35.601 "raid": { 00:28:35.601 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:35.601 "strip_size_kb": 0, 00:28:35.601 "state": "online", 00:28:35.601 "raid_level": "raid1", 00:28:35.601 "superblock": true, 00:28:35.601 "num_base_bdevs": 2, 00:28:35.601 "num_base_bdevs_discovered": 2, 00:28:35.601 "num_base_bdevs_operational": 2, 00:28:35.601 "base_bdevs_list": [ 00:28:35.601 { 00:28:35.601 "name": "pt1", 00:28:35.601 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:35.601 "is_configured": true, 00:28:35.601 "data_offset": 256, 00:28:35.601 "data_size": 7936 00:28:35.601 }, 00:28:35.601 { 00:28:35.601 "name": "pt2", 00:28:35.601 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:35.601 "is_configured": true, 00:28:35.602 "data_offset": 256, 00:28:35.602 "data_size": 7936 00:28:35.602 } 00:28:35.602 ] 00:28:35.602 } 00:28:35.602 } 00:28:35.602 }' 00:28:35.602 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:35.860 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:35.860 pt2' 00:28:35.860 12:09:49 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:35.860 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:35.860 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:36.119 "name": "pt1", 00:28:36.119 "aliases": [ 00:28:36.119 "00000000-0000-0000-0000-000000000001" 00:28:36.119 ], 00:28:36.119 "product_name": "passthru", 00:28:36.119 "block_size": 4096, 00:28:36.119 "num_blocks": 8192, 00:28:36.119 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:36.119 "assigned_rate_limits": { 00:28:36.119 "rw_ios_per_sec": 0, 00:28:36.119 "rw_mbytes_per_sec": 0, 00:28:36.119 "r_mbytes_per_sec": 0, 00:28:36.119 "w_mbytes_per_sec": 0 00:28:36.119 }, 00:28:36.119 "claimed": true, 00:28:36.119 "claim_type": "exclusive_write", 00:28:36.119 "zoned": false, 00:28:36.119 "supported_io_types": { 00:28:36.119 "read": true, 00:28:36.119 "write": true, 00:28:36.119 "unmap": true, 00:28:36.119 "flush": true, 00:28:36.119 "reset": true, 00:28:36.119 "nvme_admin": false, 00:28:36.119 "nvme_io": false, 00:28:36.119 "nvme_io_md": false, 00:28:36.119 "write_zeroes": true, 00:28:36.119 "zcopy": true, 00:28:36.119 "get_zone_info": false, 00:28:36.119 "zone_management": false, 00:28:36.119 "zone_append": false, 00:28:36.119 "compare": false, 00:28:36.119 "compare_and_write": false, 00:28:36.119 "abort": true, 00:28:36.119 "seek_hole": false, 00:28:36.119 "seek_data": false, 00:28:36.119 "copy": true, 00:28:36.119 "nvme_iov_md": false 00:28:36.119 }, 00:28:36.119 "memory_domains": [ 00:28:36.119 { 00:28:36.119 "dma_device_id": "system", 00:28:36.119 "dma_device_type": 1 00:28:36.119 }, 00:28:36.119 { 00:28:36.119 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:36.119 
"dma_device_type": 2 00:28:36.119 } 00:28:36.119 ], 00:28:36.119 "driver_specific": { 00:28:36.119 "passthru": { 00:28:36.119 "name": "pt1", 00:28:36.119 "base_bdev_name": "malloc1" 00:28:36.119 } 00:28:36.119 } 00:28:36.119 }' 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:36.119 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:36.378 12:09:49 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:36.635 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:36.635 "name": "pt2", 
00:28:36.635 "aliases": [ 00:28:36.635 "00000000-0000-0000-0000-000000000002" 00:28:36.635 ], 00:28:36.635 "product_name": "passthru", 00:28:36.635 "block_size": 4096, 00:28:36.635 "num_blocks": 8192, 00:28:36.635 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:36.635 "assigned_rate_limits": { 00:28:36.635 "rw_ios_per_sec": 0, 00:28:36.635 "rw_mbytes_per_sec": 0, 00:28:36.635 "r_mbytes_per_sec": 0, 00:28:36.635 "w_mbytes_per_sec": 0 00:28:36.635 }, 00:28:36.635 "claimed": true, 00:28:36.635 "claim_type": "exclusive_write", 00:28:36.635 "zoned": false, 00:28:36.635 "supported_io_types": { 00:28:36.635 "read": true, 00:28:36.635 "write": true, 00:28:36.635 "unmap": true, 00:28:36.635 "flush": true, 00:28:36.635 "reset": true, 00:28:36.635 "nvme_admin": false, 00:28:36.635 "nvme_io": false, 00:28:36.635 "nvme_io_md": false, 00:28:36.635 "write_zeroes": true, 00:28:36.635 "zcopy": true, 00:28:36.635 "get_zone_info": false, 00:28:36.635 "zone_management": false, 00:28:36.635 "zone_append": false, 00:28:36.635 "compare": false, 00:28:36.635 "compare_and_write": false, 00:28:36.635 "abort": true, 00:28:36.635 "seek_hole": false, 00:28:36.635 "seek_data": false, 00:28:36.635 "copy": true, 00:28:36.635 "nvme_iov_md": false 00:28:36.635 }, 00:28:36.635 "memory_domains": [ 00:28:36.635 { 00:28:36.635 "dma_device_id": "system", 00:28:36.635 "dma_device_type": 1 00:28:36.635 }, 00:28:36.635 { 00:28:36.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:36.635 "dma_device_type": 2 00:28:36.635 } 00:28:36.635 ], 00:28:36.635 "driver_specific": { 00:28:36.635 "passthru": { 00:28:36.635 "name": "pt2", 00:28:36.635 "base_bdev_name": "malloc2" 00:28:36.635 } 00:28:36.635 } 00:28:36.635 }' 00:28:36.635 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:36.635 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:36.635 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 
4096 == 4096 ]] 00:28:36.635 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:36.893 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:28:37.151 [2024-07-15 12:09:50.710786] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:37.151 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=134c5e93-a1a8-4437-b190-c66c48d8c3b7 00:28:37.151 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z 134c5e93-a1a8-4437-b190-c66c48d8c3b7 ']' 00:28:37.151 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:37.410 [2024-07-15 12:09:50.955169] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:37.410 [2024-07-15 12:09:50.955190] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:37.410 [2024-07-15 12:09:50.955247] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:37.410 [2024-07-15 12:09:50.955302] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:37.410 [2024-07-15 12:09:50.955313] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf5cdb0 name raid_bdev1, state offline 00:28:37.410 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:37.410 12:09:50 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:28:37.668 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:28:37.668 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:28:37.668 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:37.668 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:37.926 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:28:37.926 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:38.186 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:28:38.186 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:28:38.444 12:09:51 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@648 -- # local es=0 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:38.444 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:38.445 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:28:38.445 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:38.445 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:38.445 12:09:51 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:28:38.704 [2024-07-15 12:09:52.186358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:28:38.704 [2024-07-15 12:09:52.187716] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:28:38.704 [2024-07-15 12:09:52.187773] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:28:38.704 [2024-07-15 12:09:52.187811] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:28:38.704 [2024-07-15 12:09:52.187830] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:38.704 [2024-07-15 12:09:52.187840] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf5e030 name raid_bdev1, state configuring 00:28:38.704 request: 00:28:38.704 { 00:28:38.704 "name": "raid_bdev1", 00:28:38.704 "raid_level": "raid1", 00:28:38.704 "base_bdevs": [ 00:28:38.704 "malloc1", 00:28:38.704 "malloc2" 00:28:38.704 ], 00:28:38.704 "superblock": false, 00:28:38.704 "method": "bdev_raid_create", 00:28:38.704 "req_id": 1 00:28:38.704 } 00:28:38.704 Got JSON-RPC error response 00:28:38.704 response: 00:28:38.704 { 00:28:38.704 "code": -17, 00:28:38.704 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:28:38.704 } 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- # es=1 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:38.704 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:28:38.963 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:28:38.963 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:28:38.963 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:39.232 [2024-07-15 12:09:52.679622] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:39.232 [2024-07-15 12:09:52.679665] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:39.232 [2024-07-15 12:09:52.679687] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5dae0 00:28:39.232 [2024-07-15 12:09:52.679701] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:39.232 [2024-07-15 12:09:52.681319] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:39.232 [2024-07-15 12:09:52.681347] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:39.232 [2024-07-15 12:09:52.681410] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:39.232 [2024-07-15 12:09:52.681433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:39.232 pt1 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:39.232 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:39.573 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:39.573 "name": "raid_bdev1", 00:28:39.573 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:39.573 "strip_size_kb": 0, 00:28:39.573 "state": "configuring", 00:28:39.573 "raid_level": "raid1", 00:28:39.573 "superblock": true, 00:28:39.573 "num_base_bdevs": 2, 00:28:39.573 "num_base_bdevs_discovered": 1, 00:28:39.573 "num_base_bdevs_operational": 2, 00:28:39.573 "base_bdevs_list": [ 00:28:39.573 { 00:28:39.573 "name": "pt1", 00:28:39.573 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:39.573 "is_configured": true, 00:28:39.573 "data_offset": 256, 00:28:39.573 "data_size": 7936 00:28:39.573 }, 00:28:39.573 { 00:28:39.573 "name": null, 00:28:39.573 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:39.573 
"is_configured": false, 00:28:39.573 "data_offset": 256, 00:28:39.573 "data_size": 7936 00:28:39.573 } 00:28:39.573 ] 00:28:39.573 }' 00:28:39.573 12:09:52 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:39.573 12:09:52 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:40.141 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:28:40.141 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:28:40.141 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:40.141 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:40.400 [2024-07-15 12:09:53.782558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:40.400 [2024-07-15 12:09:53.782606] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:40.400 [2024-07-15 12:09:53.782623] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xebc940 00:28:40.400 [2024-07-15 12:09:53.782635] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:40.400 [2024-07-15 12:09:53.782974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:40.400 [2024-07-15 12:09:53.782993] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:40.400 [2024-07-15 12:09:53.783050] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:40.400 [2024-07-15 12:09:53.783068] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:40.400 [2024-07-15 12:09:53.783163] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xebf2c0 00:28:40.400 
[2024-07-15 12:09:53.783173] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:40.400 [2024-07-15 12:09:53.783340] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xec0100 00:28:40.400 [2024-07-15 12:09:53.783463] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xebf2c0 00:28:40.400 [2024-07-15 12:09:53.783473] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xebf2c0 00:28:40.400 [2024-07-15 12:09:53.783571] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:40.400 pt2 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:40.400 12:09:53 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:40.400 12:09:53 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:40.659 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:40.659 "name": "raid_bdev1", 00:28:40.659 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:40.659 "strip_size_kb": 0, 00:28:40.659 "state": "online", 00:28:40.659 "raid_level": "raid1", 00:28:40.659 "superblock": true, 00:28:40.659 "num_base_bdevs": 2, 00:28:40.659 "num_base_bdevs_discovered": 2, 00:28:40.659 "num_base_bdevs_operational": 2, 00:28:40.659 "base_bdevs_list": [ 00:28:40.659 { 00:28:40.659 "name": "pt1", 00:28:40.659 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:40.659 "is_configured": true, 00:28:40.659 "data_offset": 256, 00:28:40.659 "data_size": 7936 00:28:40.659 }, 00:28:40.659 { 00:28:40.659 "name": "pt2", 00:28:40.659 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:40.659 "is_configured": true, 00:28:40.659 "data_offset": 256, 00:28:40.659 "data_size": 7936 00:28:40.659 } 00:28:40.659 ] 00:28:40.659 }' 00:28:40.659 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:40.659 12:09:54 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:28:41.228 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:41.487 [2024-07-15 12:09:54.877709] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:41.487 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:28:41.487 "name": "raid_bdev1", 00:28:41.487 "aliases": [ 00:28:41.487 "134c5e93-a1a8-4437-b190-c66c48d8c3b7" 00:28:41.487 ], 00:28:41.487 "product_name": "Raid Volume", 00:28:41.487 "block_size": 4096, 00:28:41.487 "num_blocks": 7936, 00:28:41.487 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:41.487 "assigned_rate_limits": { 00:28:41.487 "rw_ios_per_sec": 0, 00:28:41.487 "rw_mbytes_per_sec": 0, 00:28:41.487 "r_mbytes_per_sec": 0, 00:28:41.487 "w_mbytes_per_sec": 0 00:28:41.487 }, 00:28:41.487 "claimed": false, 00:28:41.487 "zoned": false, 00:28:41.487 "supported_io_types": { 00:28:41.487 "read": true, 00:28:41.487 "write": true, 00:28:41.488 "unmap": false, 00:28:41.488 "flush": false, 00:28:41.488 "reset": true, 00:28:41.488 "nvme_admin": false, 00:28:41.488 "nvme_io": false, 00:28:41.488 "nvme_io_md": false, 00:28:41.488 "write_zeroes": true, 00:28:41.488 "zcopy": false, 00:28:41.488 "get_zone_info": false, 00:28:41.488 "zone_management": false, 00:28:41.488 "zone_append": false, 00:28:41.488 "compare": false, 00:28:41.488 "compare_and_write": false, 00:28:41.488 "abort": false, 00:28:41.488 "seek_hole": false, 00:28:41.488 "seek_data": false, 00:28:41.488 "copy": false, 00:28:41.488 "nvme_iov_md": false 00:28:41.488 }, 00:28:41.488 "memory_domains": [ 00:28:41.488 { 00:28:41.488 
"dma_device_id": "system", 00:28:41.488 "dma_device_type": 1 00:28:41.488 }, 00:28:41.488 { 00:28:41.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:41.488 "dma_device_type": 2 00:28:41.488 }, 00:28:41.488 { 00:28:41.488 "dma_device_id": "system", 00:28:41.488 "dma_device_type": 1 00:28:41.488 }, 00:28:41.488 { 00:28:41.488 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:41.488 "dma_device_type": 2 00:28:41.488 } 00:28:41.488 ], 00:28:41.488 "driver_specific": { 00:28:41.488 "raid": { 00:28:41.488 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:41.488 "strip_size_kb": 0, 00:28:41.488 "state": "online", 00:28:41.488 "raid_level": "raid1", 00:28:41.488 "superblock": true, 00:28:41.488 "num_base_bdevs": 2, 00:28:41.488 "num_base_bdevs_discovered": 2, 00:28:41.488 "num_base_bdevs_operational": 2, 00:28:41.488 "base_bdevs_list": [ 00:28:41.488 { 00:28:41.488 "name": "pt1", 00:28:41.488 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:41.488 "is_configured": true, 00:28:41.488 "data_offset": 256, 00:28:41.488 "data_size": 7936 00:28:41.488 }, 00:28:41.488 { 00:28:41.488 "name": "pt2", 00:28:41.488 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:41.488 "is_configured": true, 00:28:41.488 "data_offset": 256, 00:28:41.488 "data_size": 7936 00:28:41.488 } 00:28:41.488 ] 00:28:41.488 } 00:28:41.488 } 00:28:41.488 }' 00:28:41.488 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:28:41.488 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:28:41.488 pt2' 00:28:41.488 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:41.488 12:09:54 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:28:41.488 12:09:54 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:41.748 "name": "pt1", 00:28:41.748 "aliases": [ 00:28:41.748 "00000000-0000-0000-0000-000000000001" 00:28:41.748 ], 00:28:41.748 "product_name": "passthru", 00:28:41.748 "block_size": 4096, 00:28:41.748 "num_blocks": 8192, 00:28:41.748 "uuid": "00000000-0000-0000-0000-000000000001", 00:28:41.748 "assigned_rate_limits": { 00:28:41.748 "rw_ios_per_sec": 0, 00:28:41.748 "rw_mbytes_per_sec": 0, 00:28:41.748 "r_mbytes_per_sec": 0, 00:28:41.748 "w_mbytes_per_sec": 0 00:28:41.748 }, 00:28:41.748 "claimed": true, 00:28:41.748 "claim_type": "exclusive_write", 00:28:41.748 "zoned": false, 00:28:41.748 "supported_io_types": { 00:28:41.748 "read": true, 00:28:41.748 "write": true, 00:28:41.748 "unmap": true, 00:28:41.748 "flush": true, 00:28:41.748 "reset": true, 00:28:41.748 "nvme_admin": false, 00:28:41.748 "nvme_io": false, 00:28:41.748 "nvme_io_md": false, 00:28:41.748 "write_zeroes": true, 00:28:41.748 "zcopy": true, 00:28:41.748 "get_zone_info": false, 00:28:41.748 "zone_management": false, 00:28:41.748 "zone_append": false, 00:28:41.748 "compare": false, 00:28:41.748 "compare_and_write": false, 00:28:41.748 "abort": true, 00:28:41.748 "seek_hole": false, 00:28:41.748 "seek_data": false, 00:28:41.748 "copy": true, 00:28:41.748 "nvme_iov_md": false 00:28:41.748 }, 00:28:41.748 "memory_domains": [ 00:28:41.748 { 00:28:41.748 "dma_device_id": "system", 00:28:41.748 "dma_device_type": 1 00:28:41.748 }, 00:28:41.748 { 00:28:41.748 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:41.748 "dma_device_type": 2 00:28:41.748 } 00:28:41.748 ], 00:28:41.748 "driver_specific": { 00:28:41.748 "passthru": { 00:28:41.748 "name": "pt1", 00:28:41.748 "base_bdev_name": "malloc1" 00:28:41.748 } 00:28:41.748 } 00:28:41.748 }' 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:41.748 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:28:42.007 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:28:42.266 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:28:42.266 "name": "pt2", 00:28:42.266 "aliases": [ 00:28:42.266 "00000000-0000-0000-0000-000000000002" 00:28:42.266 ], 00:28:42.266 "product_name": "passthru", 00:28:42.266 "block_size": 4096, 00:28:42.266 "num_blocks": 8192, 00:28:42.266 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:42.266 "assigned_rate_limits": { 00:28:42.266 
"rw_ios_per_sec": 0, 00:28:42.266 "rw_mbytes_per_sec": 0, 00:28:42.266 "r_mbytes_per_sec": 0, 00:28:42.266 "w_mbytes_per_sec": 0 00:28:42.266 }, 00:28:42.266 "claimed": true, 00:28:42.266 "claim_type": "exclusive_write", 00:28:42.266 "zoned": false, 00:28:42.266 "supported_io_types": { 00:28:42.266 "read": true, 00:28:42.266 "write": true, 00:28:42.266 "unmap": true, 00:28:42.266 "flush": true, 00:28:42.266 "reset": true, 00:28:42.266 "nvme_admin": false, 00:28:42.266 "nvme_io": false, 00:28:42.266 "nvme_io_md": false, 00:28:42.266 "write_zeroes": true, 00:28:42.266 "zcopy": true, 00:28:42.266 "get_zone_info": false, 00:28:42.266 "zone_management": false, 00:28:42.266 "zone_append": false, 00:28:42.266 "compare": false, 00:28:42.266 "compare_and_write": false, 00:28:42.266 "abort": true, 00:28:42.266 "seek_hole": false, 00:28:42.266 "seek_data": false, 00:28:42.266 "copy": true, 00:28:42.266 "nvme_iov_md": false 00:28:42.266 }, 00:28:42.266 "memory_domains": [ 00:28:42.266 { 00:28:42.266 "dma_device_id": "system", 00:28:42.266 "dma_device_type": 1 00:28:42.266 }, 00:28:42.266 { 00:28:42.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:28:42.266 "dma_device_type": 2 00:28:42.266 } 00:28:42.266 ], 00:28:42.266 "driver_specific": { 00:28:42.266 "passthru": { 00:28:42.266 "name": "pt2", 00:28:42.266 "base_bdev_name": "malloc2" 00:28:42.266 } 00:28:42.266 } 00:28:42.266 }' 00:28:42.266 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:42.266 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:28:42.524 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:28:42.524 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:42.524 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:28:42.524 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:28:42.524 12:09:55 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.524 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:28:42.524 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:28:42.524 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.524 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:28:42.783 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:28:42.783 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:42.783 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:28:43.041 [2024-07-15 12:09:56.385703] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:43.041 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' 134c5e93-a1a8-4437-b190-c66c48d8c3b7 '!=' 134c5e93-a1a8-4437-b190-c66c48d8c3b7 ']' 00:28:43.041 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:28:43.041 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:28:43.041 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:28:43.042 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:28:43.301 [2024-07-15 12:09:56.638175] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:43.301 12:09:56 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:43.301 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:43.561 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:43.561 "name": "raid_bdev1", 00:28:43.561 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:43.561 "strip_size_kb": 0, 00:28:43.561 "state": "online", 00:28:43.561 "raid_level": "raid1", 00:28:43.561 "superblock": true, 00:28:43.561 "num_base_bdevs": 2, 00:28:43.561 "num_base_bdevs_discovered": 1, 00:28:43.561 "num_base_bdevs_operational": 1, 00:28:43.561 "base_bdevs_list": [ 00:28:43.561 { 00:28:43.561 "name": null, 00:28:43.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:43.561 "is_configured": false, 00:28:43.561 "data_offset": 256, 00:28:43.561 "data_size": 7936 
00:28:43.561 }, 00:28:43.561 { 00:28:43.561 "name": "pt2", 00:28:43.561 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:43.561 "is_configured": true, 00:28:43.561 "data_offset": 256, 00:28:43.561 "data_size": 7936 00:28:43.561 } 00:28:43.561 ] 00:28:43.561 }' 00:28:43.561 12:09:56 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:43.561 12:09:56 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:44.129 12:09:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:44.388 [2024-07-15 12:09:57.765231] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:44.388 [2024-07-15 12:09:57.765258] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:44.388 [2024-07-15 12:09:57.765317] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:44.388 [2024-07-15 12:09:57.765358] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:44.388 [2024-07-15 12:09:57.765369] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xebf2c0 name raid_bdev1, state offline 00:28:44.388 12:09:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:44.388 12:09:57 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:28:44.648 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:28:44.648 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:28:44.648 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:28:44.648 12:09:58 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:44.648 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:28:44.908 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:28:44.908 [2024-07-15 12:09:58.495129] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:28:44.908 [2024-07-15 12:09:58.495171] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:44.908 [2024-07-15 12:09:58.495187] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xebc5a0 00:28:44.908 [2024-07-15 12:09:58.495200] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:44.908 [2024-07-15 12:09:58.496778] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:44.908 [2024-07-15 12:09:58.496807] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:28:44.908 [2024-07-15 12:09:58.496872] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:28:44.908 [2024-07-15 12:09:58.496896] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:44.908 
[2024-07-15 12:09:58.496977] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xebfe10 00:28:44.908 [2024-07-15 12:09:58.496987] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:44.908 [2024-07-15 12:09:58.497161] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf5e950 00:28:44.908 [2024-07-15 12:09:58.497280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xebfe10 00:28:44.908 [2024-07-15 12:09:58.497290] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xebfe10 00:28:44.908 [2024-07-15 12:09:58.497383] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:44.908 pt2 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:45.167 "name": "raid_bdev1", 00:28:45.167 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:45.167 "strip_size_kb": 0, 00:28:45.167 "state": "online", 00:28:45.167 "raid_level": "raid1", 00:28:45.167 "superblock": true, 00:28:45.167 "num_base_bdevs": 2, 00:28:45.167 "num_base_bdevs_discovered": 1, 00:28:45.167 "num_base_bdevs_operational": 1, 00:28:45.167 "base_bdevs_list": [ 00:28:45.167 { 00:28:45.167 "name": null, 00:28:45.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:45.167 "is_configured": false, 00:28:45.167 "data_offset": 256, 00:28:45.167 "data_size": 7936 00:28:45.167 }, 00:28:45.167 { 00:28:45.167 "name": "pt2", 00:28:45.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:45.167 "is_configured": true, 00:28:45.167 "data_offset": 256, 00:28:45.167 "data_size": 7936 00:28:45.167 } 00:28:45.167 ] 00:28:45.167 }' 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:45.167 12:09:58 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:46.105 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:46.105 [2024-07-15 12:09:59.598039] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.105 [2024-07-15 12:09:59.598076] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:46.105 [2024-07-15 12:09:59.598129] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:46.105 [2024-07-15 
12:09:59.598173] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:46.105 [2024-07-15 12:09:59.598184] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xebfe10 name raid_bdev1, state offline 00:28:46.105 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.105 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:28:46.364 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:28:46.364 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:28:46.364 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:28:46.364 12:09:59 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:28:46.623 [2024-07-15 12:10:00.099361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:28:46.623 [2024-07-15 12:10:00.099414] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:46.623 [2024-07-15 12:10:00.099433] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf5b7e0 00:28:46.623 [2024-07-15 12:10:00.099446] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:46.623 [2024-07-15 12:10:00.101041] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:46.623 [2024-07-15 12:10:00.101070] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:28:46.623 [2024-07-15 12:10:00.101137] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:28:46.623 
[2024-07-15 12:10:00.101161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:28:46.623 [2024-07-15 12:10:00.101261] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:28:46.623 [2024-07-15 12:10:00.101275] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:46.623 [2024-07-15 12:10:00.101289] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf60a00 name raid_bdev1, state configuring 00:28:46.623 [2024-07-15 12:10:00.101312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:28:46.623 [2024-07-15 12:10:00.101369] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf5f760 00:28:46.623 [2024-07-15 12:10:00.101380] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:46.623 [2024-07-15 12:10:00.101545] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf60f30 00:28:46.623 [2024-07-15 12:10:00.101662] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf5f760 00:28:46.623 [2024-07-15 12:10:00.101672] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf5f760 00:28:46.623 [2024-07-15 12:10:00.101780] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:46.623 pt1 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # 
local raid_level=raid1 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:46.623 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:46.624 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:46.883 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:46.883 "name": "raid_bdev1", 00:28:46.883 "uuid": "134c5e93-a1a8-4437-b190-c66c48d8c3b7", 00:28:46.883 "strip_size_kb": 0, 00:28:46.883 "state": "online", 00:28:46.883 "raid_level": "raid1", 00:28:46.883 "superblock": true, 00:28:46.883 "num_base_bdevs": 2, 00:28:46.883 "num_base_bdevs_discovered": 1, 00:28:46.883 "num_base_bdevs_operational": 1, 00:28:46.883 "base_bdevs_list": [ 00:28:46.883 { 00:28:46.883 "name": null, 00:28:46.883 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:46.883 "is_configured": false, 00:28:46.883 "data_offset": 256, 00:28:46.883 "data_size": 7936 00:28:46.883 }, 00:28:46.883 { 00:28:46.883 "name": "pt2", 00:28:46.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:28:46.883 "is_configured": true, 00:28:46.883 "data_offset": 256, 00:28:46.883 "data_size": 7936 00:28:46.883 } 00:28:46.883 ] 00:28:46.883 }' 00:28:46.883 12:10:00 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:46.883 12:10:00 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:47.451 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:28:47.451 12:10:00 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:28:47.710 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:28:47.710 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:28:47.710 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:28:47.969 [2024-07-15 12:10:01.374931] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 134c5e93-a1a8-4437-b190-c66c48d8c3b7 '!=' 134c5e93-a1a8-4437-b190-c66c48d8c3b7 ']' 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 1598756 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@948 -- # '[' -z 1598756 ']' 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@952 -- # kill -0 1598756 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # uname 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1598756 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # 
process_name=reactor_0 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1598756' 00:28:47.969 killing process with pid 1598756 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # kill 1598756 00:28:47.969 [2024-07-15 12:10:01.449072] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:47.969 [2024-07-15 12:10:01.449129] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:47.969 [2024-07-15 12:10:01.449172] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:47.969 [2024-07-15 12:10:01.449183] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf5f760 name raid_bdev1, state offline 00:28:47.969 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@972 -- # wait 1598756 00:28:47.969 [2024-07-15 12:10:01.468331] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:48.229 12:10:01 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:28:48.229 00:28:48.229 real 0m17.140s 00:28:48.229 user 0m31.080s 00:28:48.229 sys 0m3.148s 00:28:48.229 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:28:48.229 12:10:01 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:28:48.229 ************************************ 00:28:48.229 END TEST raid_superblock_test_4k 00:28:48.229 ************************************ 00:28:48.229 12:10:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:28:48.229 12:10:01 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:28:48.229 12:10:01 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false 
true 00:28:48.229 12:10:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:28:48.229 12:10:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:28:48.229 12:10:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:48.229 ************************************ 00:28:48.229 START TEST raid_rebuild_test_sb_4k 00:28:48.229 ************************************ 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2') 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=1601339 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 1601339 /var/tmp/spdk-raid.sock 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@829 -- # '[' -z 1601339 ']' 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local max_retries=100 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k 
-- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:28:48.229 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # xtrace_disable 00:28:48.229 12:10:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:48.488 [2024-07-15 12:10:01.849732] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:28:48.488 [2024-07-15 12:10:01.849799] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1601339 ] 00:28:48.488 I/O size of 3145728 is greater than zero copy threshold (65536). 00:28:48.488 Zero copy mechanism will not be used. 00:28:48.488 [2024-07-15 12:10:01.968369] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:48.488 [2024-07-15 12:10:02.072451] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:48.748 [2024-07-15 12:10:02.143744] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:48.748 [2024-07-15 12:10:02.143801] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:28:49.316 12:10:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:28:49.316 12:10:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@862 -- # return 0 00:28:49.316 12:10:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:49.316 12:10:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 
00:28:49.575 BaseBdev1_malloc 00:28:49.575 12:10:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:28:49.833 [2024-07-15 12:10:03.317639] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:28:49.833 [2024-07-15 12:10:03.317693] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:49.833 [2024-07-15 12:10:03.317718] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c29c0 00:28:49.833 [2024-07-15 12:10:03.317731] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:49.833 [2024-07-15 12:10:03.319550] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:49.833 [2024-07-15 12:10:03.319581] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:28:49.833 BaseBdev1 00:28:49.833 12:10:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:28:49.833 12:10:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:28:50.091 BaseBdev2_malloc 00:28:50.091 12:10:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:28:50.350 [2024-07-15 12:10:03.812635] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:28:50.350 [2024-07-15 12:10:03.812681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:50.350 [2024-07-15 12:10:03.812905] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c3510 
00:28:50.350 [2024-07-15 12:10:03.812918] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:50.350 [2024-07-15 12:10:03.814480] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:50.350 [2024-07-15 12:10:03.814508] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:28:50.350 BaseBdev2 00:28:50.350 12:10:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:28:50.609 spare_malloc 00:28:50.609 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:28:50.868 spare_delay 00:28:50.868 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:51.127 [2024-07-15 12:10:04.532313] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:51.127 [2024-07-15 12:10:04.532358] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:51.127 [2024-07-15 12:10:04.532378] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb6a520 00:28:51.127 [2024-07-15 12:10:04.532391] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:51.127 [2024-07-15 12:10:04.534035] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:51.127 [2024-07-15 12:10:04.534077] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:51.127 spare 00:28:51.127 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:28:51.386 [2024-07-15 12:10:04.776985] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:51.386 [2024-07-15 12:10:04.778291] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:51.386 [2024-07-15 12:10:04.778446] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb6bab0 00:28:51.386 [2024-07-15 12:10:04.778459] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:28:51.386 [2024-07-15 12:10:04.778656] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c25c0 00:28:51.386 [2024-07-15 12:10:04.778805] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb6bab0 00:28:51.386 [2024-07-15 12:10:04.778815] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb6bab0 00:28:51.386 [2024-07-15 12:10:04.778912] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:51.386 12:10:04 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:51.386 12:10:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:51.645 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:51.645 "name": "raid_bdev1", 00:28:51.645 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:28:51.645 "strip_size_kb": 0, 00:28:51.645 "state": "online", 00:28:51.645 "raid_level": "raid1", 00:28:51.645 "superblock": true, 00:28:51.645 "num_base_bdevs": 2, 00:28:51.645 "num_base_bdevs_discovered": 2, 00:28:51.645 "num_base_bdevs_operational": 2, 00:28:51.645 "base_bdevs_list": [ 00:28:51.645 { 00:28:51.645 "name": "BaseBdev1", 00:28:51.645 "uuid": "ea1b4acb-4071-52bd-becc-61492bf6a1b6", 00:28:51.645 "is_configured": true, 00:28:51.645 "data_offset": 256, 00:28:51.645 "data_size": 7936 00:28:51.645 }, 00:28:51.645 { 00:28:51.645 "name": "BaseBdev2", 00:28:51.645 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:28:51.645 "is_configured": true, 00:28:51.645 "data_offset": 256, 00:28:51.645 "data_size": 7936 00:28:51.645 } 00:28:51.645 ] 00:28:51.645 }' 00:28:51.645 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:51.645 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:52.213 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
raid_bdev1 00:28:52.213 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:28:52.472 [2024-07-15 12:10:05.864085] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:28:52.472 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:28:52.472 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:52.472 12:10:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:28:53.041 
12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:53.041 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:28:53.300 [2024-07-15 12:10:06.637930] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c25c0 00:28:53.300 /dev/nbd0 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:28:53.300 1+0 records in 00:28:53.300 1+0 records out 00:28:53.300 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002766 s, 14.8 MB/s 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:28:53.300 12:10:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:28:54.237 7936+0 records in 00:28:54.237 7936+0 records out 00:28:54.237 32505856 bytes (33 MB, 31 MiB) copied, 0.789483 s, 41.2 MB/s 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:28:54.237 [2024-07-15 12:10:07.759340] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:28:54.237 12:10:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:28:54.497 [2024-07-15 12:10:07.991991] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:54.497 12:10:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:54.497 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:54.757 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:54.757 "name": "raid_bdev1", 00:28:54.757 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:28:54.757 "strip_size_kb": 0, 00:28:54.757 "state": "online", 00:28:54.757 "raid_level": "raid1", 00:28:54.757 "superblock": true, 00:28:54.757 "num_base_bdevs": 2, 00:28:54.757 "num_base_bdevs_discovered": 1, 00:28:54.757 "num_base_bdevs_operational": 1, 00:28:54.757 "base_bdevs_list": [ 00:28:54.757 { 00:28:54.757 "name": null, 00:28:54.757 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:54.757 "is_configured": false, 00:28:54.757 "data_offset": 256, 00:28:54.757 "data_size": 7936 00:28:54.757 }, 00:28:54.757 { 00:28:54.757 "name": "BaseBdev2", 00:28:54.757 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:28:54.757 "is_configured": true, 00:28:54.757 "data_offset": 256, 00:28:54.757 "data_size": 7936 00:28:54.757 } 00:28:54.757 ] 00:28:54.757 }' 00:28:54.757 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:54.757 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@10 -- # set +x 00:28:55.325 12:10:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:55.583 [2024-07-15 12:10:09.095035] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:55.583 [2024-07-15 12:10:09.099995] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x6ca230 00:28:55.583 [2024-07-15 12:10:09.102384] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:55.583 12:10:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:56.961 "name": "raid_bdev1", 00:28:56.961 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:28:56.961 "strip_size_kb": 0, 00:28:56.961 "state": "online", 00:28:56.961 "raid_level": "raid1", 00:28:56.961 "superblock": true, 00:28:56.961 
"num_base_bdevs": 2, 00:28:56.961 "num_base_bdevs_discovered": 2, 00:28:56.961 "num_base_bdevs_operational": 2, 00:28:56.961 "process": { 00:28:56.961 "type": "rebuild", 00:28:56.961 "target": "spare", 00:28:56.961 "progress": { 00:28:56.961 "blocks": 3072, 00:28:56.961 "percent": 38 00:28:56.961 } 00:28:56.961 }, 00:28:56.961 "base_bdevs_list": [ 00:28:56.961 { 00:28:56.961 "name": "spare", 00:28:56.961 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:28:56.961 "is_configured": true, 00:28:56.961 "data_offset": 256, 00:28:56.961 "data_size": 7936 00:28:56.961 }, 00:28:56.961 { 00:28:56.961 "name": "BaseBdev2", 00:28:56.961 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:28:56.961 "is_configured": true, 00:28:56.961 "data_offset": 256, 00:28:56.961 "data_size": 7936 00:28:56.961 } 00:28:56.961 ] 00:28:56.961 }' 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:56.961 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:57.220 [2024-07-15 12:10:10.696340] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:57.220 [2024-07-15 12:10:10.715130] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:57.220 [2024-07-15 12:10:10.715171] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:57.220 [2024-07-15 12:10:10.715186] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:57.220 
[2024-07-15 12:10:10.715194] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:57.220 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:57.480 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:57.480 "name": "raid_bdev1", 00:28:57.480 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:28:57.480 "strip_size_kb": 0, 00:28:57.480 "state": "online", 00:28:57.480 "raid_level": "raid1", 00:28:57.480 "superblock": true, 00:28:57.480 "num_base_bdevs": 2, 00:28:57.480 
"num_base_bdevs_discovered": 1, 00:28:57.480 "num_base_bdevs_operational": 1, 00:28:57.480 "base_bdevs_list": [ 00:28:57.480 { 00:28:57.480 "name": null, 00:28:57.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:57.480 "is_configured": false, 00:28:57.480 "data_offset": 256, 00:28:57.480 "data_size": 7936 00:28:57.480 }, 00:28:57.480 { 00:28:57.480 "name": "BaseBdev2", 00:28:57.480 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:28:57.480 "is_configured": true, 00:28:57.480 "data_offset": 256, 00:28:57.480 "data_size": 7936 00:28:57.480 } 00:28:57.480 ] 00:28:57.480 }' 00:28:57.480 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:57.480 12:10:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:28:58.048 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:58.049 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:58.306 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:58.306 "name": "raid_bdev1", 00:28:58.306 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:28:58.306 "strip_size_kb": 0, 00:28:58.306 "state": "online", 00:28:58.306 "raid_level": "raid1", 
00:28:58.306 "superblock": true, 00:28:58.306 "num_base_bdevs": 2, 00:28:58.306 "num_base_bdevs_discovered": 1, 00:28:58.306 "num_base_bdevs_operational": 1, 00:28:58.306 "base_bdevs_list": [ 00:28:58.306 { 00:28:58.306 "name": null, 00:28:58.306 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:58.306 "is_configured": false, 00:28:58.306 "data_offset": 256, 00:28:58.306 "data_size": 7936 00:28:58.306 }, 00:28:58.306 { 00:28:58.306 "name": "BaseBdev2", 00:28:58.306 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:28:58.307 "is_configured": true, 00:28:58.307 "data_offset": 256, 00:28:58.307 "data_size": 7936 00:28:58.307 } 00:28:58.307 ] 00:28:58.307 }' 00:28:58.307 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:58.307 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:58.307 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:58.564 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:58.564 12:10:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:58.564 [2024-07-15 12:10:12.160061] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:58.822 [2024-07-15 12:10:12.164962] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c1fb0 00:28:58.822 [2024-07-15 12:10:12.166416] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:58.822 12:10:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:59.757 12:10:13 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:59.757 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.016 "name": "raid_bdev1", 00:29:00.016 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:00.016 "strip_size_kb": 0, 00:29:00.016 "state": "online", 00:29:00.016 "raid_level": "raid1", 00:29:00.016 "superblock": true, 00:29:00.016 "num_base_bdevs": 2, 00:29:00.016 "num_base_bdevs_discovered": 2, 00:29:00.016 "num_base_bdevs_operational": 2, 00:29:00.016 "process": { 00:29:00.016 "type": "rebuild", 00:29:00.016 "target": "spare", 00:29:00.016 "progress": { 00:29:00.016 "blocks": 3072, 00:29:00.016 "percent": 38 00:29:00.016 } 00:29:00.016 }, 00:29:00.016 "base_bdevs_list": [ 00:29:00.016 { 00:29:00.016 "name": "spare", 00:29:00.016 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:00.016 "is_configured": true, 00:29:00.016 "data_offset": 256, 00:29:00.016 "data_size": 7936 00:29:00.016 }, 00:29:00.016 { 00:29:00.016 "name": "BaseBdev2", 00:29:00.016 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:00.016 "is_configured": true, 00:29:00.016 "data_offset": 256, 00:29:00.016 "data_size": 7936 00:29:00.016 } 00:29:00.016 ] 00:29:00.016 }' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:29:00.016 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1047 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:00.016 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:00.274 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:00.274 "name": "raid_bdev1", 00:29:00.274 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:00.274 "strip_size_kb": 0, 00:29:00.274 "state": "online", 00:29:00.274 "raid_level": "raid1", 00:29:00.274 "superblock": true, 00:29:00.274 "num_base_bdevs": 2, 00:29:00.274 "num_base_bdevs_discovered": 2, 00:29:00.274 "num_base_bdevs_operational": 2, 00:29:00.274 "process": { 00:29:00.274 "type": "rebuild", 00:29:00.274 "target": "spare", 00:29:00.274 "progress": { 00:29:00.274 "blocks": 3840, 00:29:00.274 "percent": 48 00:29:00.274 } 00:29:00.274 }, 00:29:00.275 "base_bdevs_list": [ 00:29:00.275 { 00:29:00.275 "name": "spare", 00:29:00.275 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:00.275 "is_configured": true, 00:29:00.275 "data_offset": 256, 00:29:00.275 "data_size": 7936 00:29:00.275 }, 00:29:00.275 { 00:29:00.275 "name": "BaseBdev2", 00:29:00.275 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:00.275 "is_configured": true, 00:29:00.275 "data_offset": 256, 00:29:00.275 "data_size": 7936 00:29:00.275 } 00:29:00.275 ] 00:29:00.275 }' 00:29:00.275 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:00.275 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:00.275 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:00.275 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:00.275 12:10:13 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:01.652 
12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:01.652 12:10:14 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:01.652 "name": "raid_bdev1", 00:29:01.652 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:01.652 "strip_size_kb": 0, 00:29:01.652 "state": "online", 00:29:01.652 "raid_level": "raid1", 00:29:01.652 "superblock": true, 00:29:01.652 "num_base_bdevs": 2, 00:29:01.652 "num_base_bdevs_discovered": 2, 00:29:01.652 "num_base_bdevs_operational": 2, 00:29:01.652 "process": { 00:29:01.652 "type": "rebuild", 00:29:01.652 "target": "spare", 00:29:01.652 "progress": { 00:29:01.652 "blocks": 7424, 00:29:01.652 "percent": 93 00:29:01.652 } 00:29:01.652 }, 00:29:01.652 "base_bdevs_list": [ 00:29:01.652 { 00:29:01.652 "name": "spare", 00:29:01.652 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:01.652 "is_configured": true, 00:29:01.652 "data_offset": 256, 00:29:01.652 "data_size": 7936 00:29:01.652 }, 00:29:01.652 { 00:29:01.652 "name": "BaseBdev2", 00:29:01.652 "uuid": 
"f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:01.652 "is_configured": true, 00:29:01.652 "data_offset": 256, 00:29:01.652 "data_size": 7936 00:29:01.652 } 00:29:01.652 ] 00:29:01.652 }' 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:01.652 12:10:15 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:29:01.911 [2024-07-15 12:10:15.290538] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:29:01.911 [2024-07-15 12:10:15.290598] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:29:01.911 [2024-07-15 12:10:15.290676] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:02.846 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:29:02.846 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:29:02.847 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:03.106 "name": "raid_bdev1", 00:29:03.106 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:03.106 "strip_size_kb": 0, 00:29:03.106 "state": "online", 00:29:03.106 "raid_level": "raid1", 00:29:03.106 "superblock": true, 00:29:03.106 "num_base_bdevs": 2, 00:29:03.106 "num_base_bdevs_discovered": 2, 00:29:03.106 "num_base_bdevs_operational": 2, 00:29:03.106 "base_bdevs_list": [ 00:29:03.106 { 00:29:03.106 "name": "spare", 00:29:03.106 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:03.106 "is_configured": true, 00:29:03.106 "data_offset": 256, 00:29:03.106 "data_size": 7936 00:29:03.106 }, 00:29:03.106 { 00:29:03.106 "name": "BaseBdev2", 00:29:03.106 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:03.106 "is_configured": true, 00:29:03.106 "data_offset": 256, 00:29:03.106 "data_size": 7936 00:29:03.106 } 00:29:03.106 ] 00:29:03.106 }' 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.106 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.365 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:03.365 "name": "raid_bdev1", 00:29:03.365 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:03.365 "strip_size_kb": 0, 00:29:03.365 "state": "online", 00:29:03.366 "raid_level": "raid1", 00:29:03.366 "superblock": true, 00:29:03.366 "num_base_bdevs": 2, 00:29:03.366 "num_base_bdevs_discovered": 2, 00:29:03.366 "num_base_bdevs_operational": 2, 00:29:03.366 "base_bdevs_list": [ 00:29:03.366 { 00:29:03.366 "name": "spare", 00:29:03.366 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:03.366 "is_configured": true, 00:29:03.366 "data_offset": 256, 00:29:03.366 "data_size": 7936 00:29:03.366 }, 00:29:03.366 { 00:29:03.366 "name": "BaseBdev2", 00:29:03.366 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:03.366 "is_configured": true, 00:29:03.366 "data_offset": 256, 00:29:03.366 "data_size": 7936 00:29:03.366 } 00:29:03.366 ] 00:29:03.366 }' 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ 
none == \n\o\n\e ]] 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:03.366 12:10:16 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:03.625 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:03.625 "name": "raid_bdev1", 00:29:03.625 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:03.625 "strip_size_kb": 0, 00:29:03.625 "state": "online", 00:29:03.625 "raid_level": "raid1", 00:29:03.625 "superblock": true, 00:29:03.625 "num_base_bdevs": 2, 00:29:03.625 "num_base_bdevs_discovered": 2, 00:29:03.625 "num_base_bdevs_operational": 2, 00:29:03.625 "base_bdevs_list": [ 00:29:03.625 { 
00:29:03.625 "name": "spare", 00:29:03.625 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:03.625 "is_configured": true, 00:29:03.625 "data_offset": 256, 00:29:03.625 "data_size": 7936 00:29:03.625 }, 00:29:03.625 { 00:29:03.625 "name": "BaseBdev2", 00:29:03.625 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:03.625 "is_configured": true, 00:29:03.625 "data_offset": 256, 00:29:03.625 "data_size": 7936 00:29:03.625 } 00:29:03.625 ] 00:29:03.625 }' 00:29:03.625 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:03.625 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:04.193 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:04.453 [2024-07-15 12:10:17.966982] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:04.453 [2024-07-15 12:10:17.967010] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:04.453 [2024-07-15 12:10:17.967070] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:04.453 [2024-07-15 12:10:17.967126] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:04.453 [2024-07-15 12:10:17.967138] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb6bab0 name raid_bdev1, state offline 00:29:04.453 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:04.453 12:10:17 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:04.712 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:29:05.281 /dev/nbd0 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:05.281 1+0 records in 00:29:05.281 1+0 records out 00:29:05.281 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259395 s, 15.8 MB/s 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:05.281 12:10:18 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:29:05.542 /dev/nbd1 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd1 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local i 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:05.542 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # break 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:05.543 1+0 records in 00:29:05.543 1+0 records out 00:29:05.543 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235649 s, 17.4 MB/s 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # size=4096 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # return 0 
00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:29:05.543 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:05.841 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:29:06.137 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:06.396 12:10:19 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:06.655 [2024-07-15 12:10:20.171509] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:06.655 [2024-07-15 12:10:20.171565] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:06.655 [2024-07-15 12:10:20.171585] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb67d50 00:29:06.655 [2024-07-15 12:10:20.171598] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:06.655 [2024-07-15 12:10:20.173267] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:06.655 [2024-07-15 12:10:20.173297] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:06.655 [2024-07-15 12:10:20.173362] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:06.655 [2024-07-15 12:10:20.173386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:06.655 [2024-07-15 12:10:20.173505] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:06.655 spare 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:06.655 12:10:20 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:06.655 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:06.915 [2024-07-15 12:10:20.273815] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x9bba40 00:29:06.915 [2024-07-15 12:10:20.273829] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:06.915 [2024-07-15 12:10:20.274009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9c25c0 00:29:06.915 [2024-07-15 12:10:20.274148] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x9bba40 00:29:06.915 [2024-07-15 12:10:20.274158] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x9bba40 00:29:06.915 [2024-07-15 12:10:20.274260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:06.915 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:06.915 "name": "raid_bdev1", 00:29:06.915 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:06.915 "strip_size_kb": 0, 00:29:06.915 "state": "online", 00:29:06.915 "raid_level": "raid1", 00:29:06.915 "superblock": true, 00:29:06.915 "num_base_bdevs": 2, 00:29:06.915 "num_base_bdevs_discovered": 2, 00:29:06.915 "num_base_bdevs_operational": 2, 00:29:06.915 "base_bdevs_list": [ 00:29:06.915 { 00:29:06.915 "name": "spare", 00:29:06.915 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:06.915 "is_configured": true, 00:29:06.915 "data_offset": 256, 00:29:06.915 "data_size": 7936 00:29:06.915 }, 00:29:06.915 { 00:29:06.915 "name": "BaseBdev2", 00:29:06.915 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:06.915 "is_configured": true, 00:29:06.915 "data_offset": 256, 00:29:06.915 "data_size": 7936 00:29:06.915 } 
00:29:06.915 ] 00:29:06.915 }' 00:29:06.915 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:06.915 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.483 12:10:20 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:07.741 "name": "raid_bdev1", 00:29:07.741 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:07.741 "strip_size_kb": 0, 00:29:07.741 "state": "online", 00:29:07.741 "raid_level": "raid1", 00:29:07.741 "superblock": true, 00:29:07.741 "num_base_bdevs": 2, 00:29:07.741 "num_base_bdevs_discovered": 2, 00:29:07.741 "num_base_bdevs_operational": 2, 00:29:07.741 "base_bdevs_list": [ 00:29:07.741 { 00:29:07.741 "name": "spare", 00:29:07.741 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:07.741 "is_configured": true, 00:29:07.741 "data_offset": 256, 00:29:07.741 "data_size": 7936 00:29:07.741 }, 00:29:07.741 { 00:29:07.741 "name": "BaseBdev2", 00:29:07.741 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:07.741 "is_configured": true, 
00:29:07.741 "data_offset": 256, 00:29:07.741 "data_size": 7936 00:29:07.741 } 00:29:07.741 ] 00:29:07.741 }' 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:07.741 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:29:07.999 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:29:07.999 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:29:08.258 [2024-07-15 12:10:21.783913] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:08.258 12:10:21 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:08.517 12:10:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:08.517 "name": "raid_bdev1", 00:29:08.517 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:08.517 "strip_size_kb": 0, 00:29:08.517 "state": "online", 00:29:08.517 "raid_level": "raid1", 00:29:08.517 "superblock": true, 00:29:08.517 "num_base_bdevs": 2, 00:29:08.517 "num_base_bdevs_discovered": 1, 00:29:08.517 "num_base_bdevs_operational": 1, 00:29:08.517 "base_bdevs_list": [ 00:29:08.517 { 00:29:08.517 "name": null, 00:29:08.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:08.517 "is_configured": false, 00:29:08.517 "data_offset": 256, 00:29:08.517 "data_size": 7936 00:29:08.517 }, 00:29:08.517 { 00:29:08.517 "name": "BaseBdev2", 00:29:08.517 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:08.517 "is_configured": true, 00:29:08.517 "data_offset": 256, 00:29:08.517 "data_size": 7936 00:29:08.517 } 00:29:08.517 ] 00:29:08.517 }' 00:29:08.517 12:10:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:08.517 12:10:22 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:09.085 12:10:22 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:09.344 [2024-07-15 12:10:22.878823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:09.344 [2024-07-15 12:10:22.878970] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:09.344 [2024-07-15 12:10:22.878988] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:09.344 [2024-07-15 12:10:22.879014] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:09.344 [2024-07-15 12:10:22.883785] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb665b0 00:29:09.344 [2024-07-15 12:10:22.885128] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:09.344 12:10:22 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:10.721 12:10:23 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name 
== "raid_bdev1")' 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:10.721 "name": "raid_bdev1", 00:29:10.721 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:10.721 "strip_size_kb": 0, 00:29:10.721 "state": "online", 00:29:10.721 "raid_level": "raid1", 00:29:10.721 "superblock": true, 00:29:10.721 "num_base_bdevs": 2, 00:29:10.721 "num_base_bdevs_discovered": 2, 00:29:10.721 "num_base_bdevs_operational": 2, 00:29:10.721 "process": { 00:29:10.721 "type": "rebuild", 00:29:10.721 "target": "spare", 00:29:10.721 "progress": { 00:29:10.721 "blocks": 3072, 00:29:10.721 "percent": 38 00:29:10.721 } 00:29:10.721 }, 00:29:10.721 "base_bdevs_list": [ 00:29:10.721 { 00:29:10.721 "name": "spare", 00:29:10.721 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:10.721 "is_configured": true, 00:29:10.721 "data_offset": 256, 00:29:10.721 "data_size": 7936 00:29:10.721 }, 00:29:10.721 { 00:29:10.721 "name": "BaseBdev2", 00:29:10.721 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:10.721 "is_configured": true, 00:29:10.721 "data_offset": 256, 00:29:10.721 "data_size": 7936 00:29:10.721 } 00:29:10.721 ] 00:29:10.721 }' 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:10.721 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:11.290 [2024-07-15 12:10:24.733269] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:11.290 [2024-07-15 
12:10:24.799882] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:11.290 [2024-07-15 12:10:24.799925] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:11.290 [2024-07-15 12:10:24.799939] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:11.290 [2024-07-15 12:10:24.799948] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:11.290 12:10:24 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:11.549 12:10:25 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:11.550 "name": "raid_bdev1", 00:29:11.550 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:11.550 "strip_size_kb": 0, 00:29:11.550 "state": "online", 00:29:11.550 "raid_level": "raid1", 00:29:11.550 "superblock": true, 00:29:11.550 "num_base_bdevs": 2, 00:29:11.550 "num_base_bdevs_discovered": 1, 00:29:11.550 "num_base_bdevs_operational": 1, 00:29:11.550 "base_bdevs_list": [ 00:29:11.550 { 00:29:11.550 "name": null, 00:29:11.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:11.550 "is_configured": false, 00:29:11.550 "data_offset": 256, 00:29:11.550 "data_size": 7936 00:29:11.550 }, 00:29:11.550 { 00:29:11.550 "name": "BaseBdev2", 00:29:11.550 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:11.550 "is_configured": true, 00:29:11.550 "data_offset": 256, 00:29:11.550 "data_size": 7936 00:29:11.550 } 00:29:11.550 ] 00:29:11.550 }' 00:29:11.550 12:10:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:11.550 12:10:25 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:12.117 12:10:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:12.376 [2024-07-15 12:10:25.911539] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:12.376 [2024-07-15 12:10:25.911586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:12.376 [2024-07-15 12:10:25.911611] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9bbdc0 00:29:12.376 [2024-07-15 12:10:25.911623] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:12.376 [2024-07-15 12:10:25.912000] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:12.376 [2024-07-15 
12:10:25.912018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:12.376 [2024-07-15 12:10:25.912102] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:29:12.376 [2024-07-15 12:10:25.912114] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:29:12.376 [2024-07-15 12:10:25.912124] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:29:12.376 [2024-07-15 12:10:25.912142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:12.376 [2024-07-15 12:10:25.917154] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x9bfb10 00:29:12.376 spare 00:29:12.376 [2024-07-15 12:10:25.918514] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:12.376 12:10:25 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:13.756 12:10:26 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:13.756 12:10:27 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:13.756 "name": "raid_bdev1", 00:29:13.756 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:13.756 "strip_size_kb": 0, 00:29:13.756 "state": "online", 00:29:13.756 "raid_level": "raid1", 00:29:13.756 "superblock": true, 00:29:13.756 "num_base_bdevs": 2, 00:29:13.756 "num_base_bdevs_discovered": 2, 00:29:13.756 "num_base_bdevs_operational": 2, 00:29:13.756 "process": { 00:29:13.756 "type": "rebuild", 00:29:13.756 "target": "spare", 00:29:13.756 "progress": { 00:29:13.756 "blocks": 3072, 00:29:13.756 "percent": 38 00:29:13.756 } 00:29:13.756 }, 00:29:13.756 "base_bdevs_list": [ 00:29:13.756 { 00:29:13.756 "name": "spare", 00:29:13.756 "uuid": "a9ac6080-0b9c-55fa-853d-b23ac73521d0", 00:29:13.756 "is_configured": true, 00:29:13.756 "data_offset": 256, 00:29:13.756 "data_size": 7936 00:29:13.756 }, 00:29:13.756 { 00:29:13.756 "name": "BaseBdev2", 00:29:13.756 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:13.756 "is_configured": true, 00:29:13.756 "data_offset": 256, 00:29:13.756 "data_size": 7936 00:29:13.756 } 00:29:13.756 ] 00:29:13.756 }' 00:29:13.756 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:13.756 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:13.756 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:13.756 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:13.756 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:29:14.014 [2024-07-15 12:10:27.513641] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:14.014 [2024-07-15 12:10:27.531406] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:14.014 [2024-07-15 12:10:27.531447] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:14.014 [2024-07-15 12:10:27.531461] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:14.014 [2024-07-15 12:10:27.531470] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.014 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:14.273 12:10:27 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:14.273 "name": "raid_bdev1", 00:29:14.273 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:14.273 "strip_size_kb": 0, 00:29:14.273 "state": "online", 00:29:14.273 "raid_level": "raid1", 00:29:14.273 "superblock": true, 00:29:14.273 "num_base_bdevs": 2, 00:29:14.273 "num_base_bdevs_discovered": 1, 00:29:14.273 "num_base_bdevs_operational": 1, 00:29:14.274 "base_bdevs_list": [ 00:29:14.274 { 00:29:14.274 "name": null, 00:29:14.274 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:14.274 "is_configured": false, 00:29:14.274 "data_offset": 256, 00:29:14.274 "data_size": 7936 00:29:14.274 }, 00:29:14.274 { 00:29:14.274 "name": "BaseBdev2", 00:29:14.274 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:14.274 "is_configured": true, 00:29:14.274 "data_offset": 256, 00:29:14.274 "data_size": 7936 00:29:14.274 } 00:29:14.274 ] 00:29:14.274 }' 00:29:14.274 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:14.274 12:10:27 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:14.842 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:15.102 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:15.102 "name": "raid_bdev1", 00:29:15.102 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:15.102 "strip_size_kb": 0, 00:29:15.102 "state": "online", 00:29:15.102 "raid_level": "raid1", 00:29:15.102 "superblock": true, 00:29:15.102 "num_base_bdevs": 2, 00:29:15.102 "num_base_bdevs_discovered": 1, 00:29:15.102 "num_base_bdevs_operational": 1, 00:29:15.102 "base_bdevs_list": [ 00:29:15.102 { 00:29:15.102 "name": null, 00:29:15.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:15.102 "is_configured": false, 00:29:15.102 "data_offset": 256, 00:29:15.102 "data_size": 7936 00:29:15.102 }, 00:29:15.102 { 00:29:15.102 "name": "BaseBdev2", 00:29:15.102 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:15.102 "is_configured": true, 00:29:15.102 "data_offset": 256, 00:29:15.102 "data_size": 7936 00:29:15.102 } 00:29:15.102 ] 00:29:15.102 }' 00:29:15.102 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:15.361 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:15.362 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:15.362 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:15.362 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:29:15.621 12:10:28 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:15.621 [2024-07-15 12:10:29.216257] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:29:15.621 [2024-07-15 12:10:29.216305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:15.621 [2024-07-15 12:10:29.216325] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9c2bf0 00:29:15.621 [2024-07-15 12:10:29.216338] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:15.621 [2024-07-15 12:10:29.216666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:15.621 [2024-07-15 12:10:29.216682] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:15.621 [2024-07-15 12:10:29.216754] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:29:15.621 [2024-07-15 12:10:29.216766] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:15.621 [2024-07-15 12:10:29.216775] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:15.880 BaseBdev1 00:29:15.880 12:10:29 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:16.818 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.077 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:17.077 "name": "raid_bdev1", 00:29:17.077 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:17.077 "strip_size_kb": 0, 00:29:17.077 "state": "online", 00:29:17.077 "raid_level": "raid1", 00:29:17.077 "superblock": true, 00:29:17.077 "num_base_bdevs": 2, 00:29:17.077 "num_base_bdevs_discovered": 1, 00:29:17.077 "num_base_bdevs_operational": 1, 00:29:17.077 "base_bdevs_list": [ 00:29:17.077 { 00:29:17.077 "name": null, 00:29:17.077 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.077 "is_configured": false, 00:29:17.077 "data_offset": 256, 00:29:17.077 "data_size": 7936 00:29:17.077 }, 00:29:17.077 { 00:29:17.077 "name": "BaseBdev2", 00:29:17.077 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:17.077 "is_configured": true, 00:29:17.077 "data_offset": 256, 00:29:17.077 "data_size": 7936 00:29:17.077 } 00:29:17.077 ] 00:29:17.077 }' 00:29:17.077 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:17.077 12:10:30 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none 
none 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:17.646 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:17.905 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:17.905 "name": "raid_bdev1", 00:29:17.905 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:17.905 "strip_size_kb": 0, 00:29:17.905 "state": "online", 00:29:17.905 "raid_level": "raid1", 00:29:17.905 "superblock": true, 00:29:17.905 "num_base_bdevs": 2, 00:29:17.905 "num_base_bdevs_discovered": 1, 00:29:17.905 "num_base_bdevs_operational": 1, 00:29:17.905 "base_bdevs_list": [ 00:29:17.905 { 00:29:17.905 "name": null, 00:29:17.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:17.906 "is_configured": false, 00:29:17.906 "data_offset": 256, 00:29:17.906 "data_size": 7936 00:29:17.906 }, 00:29:17.906 { 00:29:17.906 "name": "BaseBdev2", 00:29:17.906 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:17.906 "is_configured": true, 00:29:17.906 "data_offset": 256, 00:29:17.906 "data_size": 7936 00:29:17.906 } 00:29:17.906 ] 00:29:17.906 }' 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:17.906 12:10:31 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@648 -- # local es=0 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:17.906 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:17.906 12:10:31 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:29:18.165 [2024-07-15 12:10:31.626660] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:18.165 [2024-07-15 12:10:31.626784] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:29:18.165 [2024-07-15 12:10:31.626800] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:29:18.165 request: 00:29:18.165 { 00:29:18.165 "base_bdev": "BaseBdev1", 00:29:18.165 "raid_bdev": "raid_bdev1", 00:29:18.165 "method": "bdev_raid_add_base_bdev", 00:29:18.165 "req_id": 1 00:29:18.165 } 00:29:18.165 Got JSON-RPC error response 00:29:18.165 response: 00:29:18.165 { 00:29:18.165 "code": -22, 00:29:18.165 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:29:18.165 } 00:29:18.165 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@651 -- # es=1 00:29:18.165 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:18.165 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:18.165 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:18.165 12:10:31 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:19.101 12:10:32 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:19.101 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:19.360 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:19.360 "name": "raid_bdev1", 00:29:19.360 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:19.360 "strip_size_kb": 0, 00:29:19.360 "state": "online", 00:29:19.360 "raid_level": "raid1", 00:29:19.360 "superblock": true, 00:29:19.360 "num_base_bdevs": 2, 00:29:19.360 "num_base_bdevs_discovered": 1, 00:29:19.360 "num_base_bdevs_operational": 1, 00:29:19.360 "base_bdevs_list": [ 00:29:19.360 { 00:29:19.360 "name": null, 00:29:19.360 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:19.360 "is_configured": false, 00:29:19.360 "data_offset": 256, 00:29:19.360 "data_size": 7936 00:29:19.360 }, 00:29:19.360 { 00:29:19.360 "name": "BaseBdev2", 00:29:19.360 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:19.360 "is_configured": true, 00:29:19.360 "data_offset": 256, 00:29:19.360 "data_size": 7936 
00:29:19.360 } 00:29:19.360 ] 00:29:19.360 }' 00:29:19.360 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:19.360 12:10:32 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:19.928 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:20.187 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:20.187 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:20.187 "name": "raid_bdev1", 00:29:20.187 "uuid": "c95cf120-7b0a-48ff-987c-b5583f13bde9", 00:29:20.187 "strip_size_kb": 0, 00:29:20.187 "state": "online", 00:29:20.187 "raid_level": "raid1", 00:29:20.187 "superblock": true, 00:29:20.187 "num_base_bdevs": 2, 00:29:20.187 "num_base_bdevs_discovered": 1, 00:29:20.187 "num_base_bdevs_operational": 1, 00:29:20.187 "base_bdevs_list": [ 00:29:20.187 { 00:29:20.187 "name": null, 00:29:20.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:20.187 "is_configured": false, 00:29:20.187 "data_offset": 256, 00:29:20.187 "data_size": 7936 00:29:20.187 }, 00:29:20.187 { 00:29:20.187 "name": "BaseBdev2", 00:29:20.187 "uuid": "f0a9b02c-fa80-52c3-9a55-860f1f0e20ab", 00:29:20.187 
"is_configured": true, 00:29:20.187 "data_offset": 256, 00:29:20.187 "data_size": 7936 00:29:20.187 } 00:29:20.187 ] 00:29:20.187 }' 00:29:20.187 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 1601339 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@948 -- # '[' -z 1601339 ']' 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@952 -- # kill -0 1601339 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # uname 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1601339 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1601339' 00:29:20.446 killing process with pid 1601339 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # kill 1601339 00:29:20.446 Received shutdown signal, test time was about 60.000000 seconds 00:29:20.446 00:29:20.446 Latency(us) 00:29:20.446 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:20.446 
=================================================================================================================== 00:29:20.446 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:29:20.446 [2024-07-15 12:10:33.901098] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:20.446 [2024-07-15 12:10:33.901187] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:20.446 [2024-07-15 12:10:33.901231] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:20.446 [2024-07-15 12:10:33.901244] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x9bba40 name raid_bdev1, state offline 00:29:20.446 12:10:33 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@972 -- # wait 1601339 00:29:20.446 [2024-07-15 12:10:33.928155] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:20.705 12:10:34 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:29:20.705 00:29:20.705 real 0m32.367s 00:29:20.705 user 0m50.650s 00:29:20.705 sys 0m5.365s 00:29:20.705 12:10:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:20.705 12:10:34 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:29:20.705 ************************************ 00:29:20.705 END TEST raid_rebuild_test_sb_4k 00:29:20.705 ************************************ 00:29:20.705 12:10:34 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:20.705 12:10:34 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:29:20.705 12:10:34 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:29:20.705 12:10:34 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:29:20.705 12:10:34 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:20.705 12:10:34 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:29:20.705 ************************************ 00:29:20.705 START TEST raid_state_function_test_sb_md_separate 00:29:20.705 ************************************ 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:20.705 12:10:34 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1605846 00:29:20.705 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1605846' 00:29:20.705 Process raid pid: 1605846 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1605846 /var/tmp/spdk-raid.sock 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1605846 ']' 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@833 
-- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:20.706 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:20.706 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:20.965 [2024-07-15 12:10:34.312662] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:29:20.965 [2024-07-15 12:10:34.312775] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:29:20.965 [2024-07-15 12:10:34.433980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:20.965 [2024-07-15 12:10:34.531999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:21.224 [2024-07-15 12:10:34.592708] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:21.224 [2024-07-15 12:10:34.592739] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:21.224 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:21.224 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:21.224 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:21.482 [2024-07-15 12:10:34.886824] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:21.482 [2024-07-15 12:10:34.886868] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:21.482 [2024-07-15 12:10:34.886878] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:21.482 [2024-07-15 12:10:34.886890] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:21.482 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:21.483 12:10:34 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:21.742 12:10:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:21.742 "name": "Existed_Raid", 00:29:21.742 "uuid": "674db4a7-7961-40f9-8859-2178741b2eb4", 00:29:21.742 "strip_size_kb": 0, 00:29:21.742 "state": "configuring", 00:29:21.742 "raid_level": "raid1", 00:29:21.742 "superblock": true, 00:29:21.742 "num_base_bdevs": 2, 00:29:21.742 "num_base_bdevs_discovered": 0, 00:29:21.742 "num_base_bdevs_operational": 2, 00:29:21.742 "base_bdevs_list": [ 00:29:21.742 { 00:29:21.742 "name": "BaseBdev1", 00:29:21.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.742 "is_configured": false, 00:29:21.742 "data_offset": 0, 00:29:21.742 "data_size": 0 00:29:21.742 }, 00:29:21.742 { 00:29:21.742 "name": "BaseBdev2", 00:29:21.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:21.742 "is_configured": false, 00:29:21.742 "data_offset": 0, 00:29:21.742 "data_size": 0 00:29:21.742 } 00:29:21.742 ] 00:29:21.742 }' 00:29:21.742 12:10:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:21.742 12:10:35 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:22.309 12:10:35 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:22.568 [2024-07-15 12:10:35.989592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:22.568 [2024-07-15 12:10:35.989621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ccb00 name Existed_Raid, state 
configuring 00:29:22.568 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:22.827 [2024-07-15 12:10:36.234275] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:29:22.827 [2024-07-15 12:10:36.234306] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:29:22.827 [2024-07-15 12:10:36.234315] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:22.827 [2024-07-15 12:10:36.234326] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:22.827 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:29:23.085 [2024-07-15 12:10:36.489335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:23.085 BaseBdev1 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:23.085 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:23.085 12:10:36 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:23.344 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:29:23.602 [ 00:29:23.602 { 00:29:23.602 "name": "BaseBdev1", 00:29:23.602 "aliases": [ 00:29:23.602 "e1232cd8-8393-4efd-984f-ff9c5d05bdb7" 00:29:23.602 ], 00:29:23.602 "product_name": "Malloc disk", 00:29:23.602 "block_size": 4096, 00:29:23.602 "num_blocks": 8192, 00:29:23.602 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:23.602 "md_size": 32, 00:29:23.602 "md_interleave": false, 00:29:23.602 "dif_type": 0, 00:29:23.602 "assigned_rate_limits": { 00:29:23.602 "rw_ios_per_sec": 0, 00:29:23.602 "rw_mbytes_per_sec": 0, 00:29:23.602 "r_mbytes_per_sec": 0, 00:29:23.602 "w_mbytes_per_sec": 0 00:29:23.602 }, 00:29:23.602 "claimed": true, 00:29:23.602 "claim_type": "exclusive_write", 00:29:23.602 "zoned": false, 00:29:23.602 "supported_io_types": { 00:29:23.602 "read": true, 00:29:23.602 "write": true, 00:29:23.602 "unmap": true, 00:29:23.602 "flush": true, 00:29:23.602 "reset": true, 00:29:23.602 "nvme_admin": false, 00:29:23.602 "nvme_io": false, 00:29:23.602 "nvme_io_md": false, 00:29:23.602 "write_zeroes": true, 00:29:23.602 "zcopy": true, 00:29:23.602 "get_zone_info": false, 00:29:23.602 "zone_management": false, 00:29:23.602 "zone_append": false, 00:29:23.602 "compare": false, 00:29:23.602 "compare_and_write": false, 00:29:23.602 "abort": true, 00:29:23.602 "seek_hole": false, 00:29:23.602 "seek_data": false, 00:29:23.602 "copy": true, 00:29:23.602 "nvme_iov_md": false 00:29:23.602 }, 00:29:23.602 "memory_domains": [ 00:29:23.602 { 00:29:23.602 "dma_device_id": "system", 00:29:23.602 "dma_device_type": 1 00:29:23.602 }, 
00:29:23.602 { 00:29:23.602 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:23.602 "dma_device_type": 2 00:29:23.602 } 00:29:23.602 ], 00:29:23.602 "driver_specific": {} 00:29:23.602 } 00:29:23.602 ] 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:23.602 12:10:36 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:29:23.861 12:10:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:23.861 "name": "Existed_Raid", 00:29:23.861 "uuid": "0e491714-2b93-44bc-83bd-168c3193d68b", 00:29:23.861 "strip_size_kb": 0, 00:29:23.861 "state": "configuring", 00:29:23.861 "raid_level": "raid1", 00:29:23.861 "superblock": true, 00:29:23.861 "num_base_bdevs": 2, 00:29:23.861 "num_base_bdevs_discovered": 1, 00:29:23.861 "num_base_bdevs_operational": 2, 00:29:23.861 "base_bdevs_list": [ 00:29:23.861 { 00:29:23.861 "name": "BaseBdev1", 00:29:23.861 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:23.861 "is_configured": true, 00:29:23.861 "data_offset": 256, 00:29:23.861 "data_size": 7936 00:29:23.861 }, 00:29:23.861 { 00:29:23.861 "name": "BaseBdev2", 00:29:23.861 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:23.861 "is_configured": false, 00:29:23.861 "data_offset": 0, 00:29:23.861 "data_size": 0 00:29:23.861 } 00:29:23.861 ] 00:29:23.861 }' 00:29:23.861 12:10:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:23.861 12:10:37 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:24.426 12:10:37 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:29:24.683 [2024-07-15 12:10:38.169786] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:29:24.683 [2024-07-15 12:10:38.169829] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12cc3d0 name Existed_Raid, state configuring 00:29:24.683 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:29:24.941 
[2024-07-15 12:10:38.402452] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:24.941 [2024-07-15 12:10:38.403909] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:29:24.941 [2024-07-15 12:10:38.403943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:24.942 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:25.200 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:25.200 "name": "Existed_Raid", 00:29:25.200 "uuid": "6406752e-bd24-4a5d-996b-88fecde972a7", 00:29:25.200 "strip_size_kb": 0, 00:29:25.200 "state": "configuring", 00:29:25.200 "raid_level": "raid1", 00:29:25.200 "superblock": true, 00:29:25.200 "num_base_bdevs": 2, 00:29:25.200 "num_base_bdevs_discovered": 1, 00:29:25.200 "num_base_bdevs_operational": 2, 00:29:25.200 "base_bdevs_list": [ 00:29:25.200 { 00:29:25.200 "name": "BaseBdev1", 00:29:25.200 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:25.200 "is_configured": true, 00:29:25.200 "data_offset": 256, 00:29:25.200 "data_size": 7936 00:29:25.200 }, 00:29:25.200 { 00:29:25.200 "name": "BaseBdev2", 00:29:25.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:25.200 "is_configured": false, 00:29:25.200 "data_offset": 0, 00:29:25.200 "data_size": 0 00:29:25.200 } 00:29:25.200 ] 00:29:25.200 }' 00:29:25.200 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:25.200 12:10:38 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:25.768 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:29:26.028 [2024-07-15 12:10:39.533433] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:26.028 [2024-07-15 12:10:39.533581] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 
0x12ce240 00:29:26.028 [2024-07-15 12:10:39.533594] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:26.028 [2024-07-15 12:10:39.533658] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12cca90 00:29:26.028 [2024-07-15 12:10:39.533774] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ce240 00:29:26.028 [2024-07-15 12:10:39.533785] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12ce240 00:29:26.028 [2024-07-15 12:10:39.533853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:26.028 BaseBdev2 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local i 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:29:26.028 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:29:26.287 12:10:39 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:29:26.546 [ 00:29:26.546 { 00:29:26.546 "name": "BaseBdev2", 00:29:26.546 "aliases": [ 00:29:26.546 
"9956c440-6eb4-4b9b-bcc7-77d38871a1c1" 00:29:26.546 ], 00:29:26.546 "product_name": "Malloc disk", 00:29:26.546 "block_size": 4096, 00:29:26.546 "num_blocks": 8192, 00:29:26.546 "uuid": "9956c440-6eb4-4b9b-bcc7-77d38871a1c1", 00:29:26.546 "md_size": 32, 00:29:26.546 "md_interleave": false, 00:29:26.546 "dif_type": 0, 00:29:26.546 "assigned_rate_limits": { 00:29:26.546 "rw_ios_per_sec": 0, 00:29:26.546 "rw_mbytes_per_sec": 0, 00:29:26.546 "r_mbytes_per_sec": 0, 00:29:26.546 "w_mbytes_per_sec": 0 00:29:26.546 }, 00:29:26.546 "claimed": true, 00:29:26.546 "claim_type": "exclusive_write", 00:29:26.546 "zoned": false, 00:29:26.546 "supported_io_types": { 00:29:26.546 "read": true, 00:29:26.546 "write": true, 00:29:26.546 "unmap": true, 00:29:26.546 "flush": true, 00:29:26.546 "reset": true, 00:29:26.546 "nvme_admin": false, 00:29:26.546 "nvme_io": false, 00:29:26.546 "nvme_io_md": false, 00:29:26.546 "write_zeroes": true, 00:29:26.546 "zcopy": true, 00:29:26.546 "get_zone_info": false, 00:29:26.546 "zone_management": false, 00:29:26.546 "zone_append": false, 00:29:26.546 "compare": false, 00:29:26.546 "compare_and_write": false, 00:29:26.546 "abort": true, 00:29:26.546 "seek_hole": false, 00:29:26.546 "seek_data": false, 00:29:26.546 "copy": true, 00:29:26.546 "nvme_iov_md": false 00:29:26.546 }, 00:29:26.546 "memory_domains": [ 00:29:26.546 { 00:29:26.546 "dma_device_id": "system", 00:29:26.546 "dma_device_type": 1 00:29:26.546 }, 00:29:26.546 { 00:29:26.546 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:26.546 "dma_device_type": 2 00:29:26.546 } 00:29:26.546 ], 00:29:26.546 "driver_specific": {} 00:29:26.546 } 00:29:26.546 ] 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # return 0 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:26.546 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:26.806 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:26.806 "name": "Existed_Raid", 00:29:26.806 "uuid": "6406752e-bd24-4a5d-996b-88fecde972a7", 00:29:26.806 "strip_size_kb": 0, 00:29:26.806 "state": "online", 00:29:26.806 "raid_level": "raid1", 
00:29:26.806 "superblock": true, 00:29:26.806 "num_base_bdevs": 2, 00:29:26.806 "num_base_bdevs_discovered": 2, 00:29:26.806 "num_base_bdevs_operational": 2, 00:29:26.806 "base_bdevs_list": [ 00:29:26.806 { 00:29:26.806 "name": "BaseBdev1", 00:29:26.806 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:26.806 "is_configured": true, 00:29:26.806 "data_offset": 256, 00:29:26.806 "data_size": 7936 00:29:26.806 }, 00:29:26.806 { 00:29:26.806 "name": "BaseBdev2", 00:29:26.806 "uuid": "9956c440-6eb4-4b9b-bcc7-77d38871a1c1", 00:29:26.806 "is_configured": true, 00:29:26.806 "data_offset": 256, 00:29:26.806 "data_size": 7936 00:29:26.806 } 00:29:26.806 ] 00:29:26.806 }' 00:29:26.806 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:26.806 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:27.374 12:10:40 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
Existed_Raid 00:29:27.633 [2024-07-15 12:10:41.117958] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:27.633 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:27.634 "name": "Existed_Raid", 00:29:27.634 "aliases": [ 00:29:27.634 "6406752e-bd24-4a5d-996b-88fecde972a7" 00:29:27.634 ], 00:29:27.634 "product_name": "Raid Volume", 00:29:27.634 "block_size": 4096, 00:29:27.634 "num_blocks": 7936, 00:29:27.634 "uuid": "6406752e-bd24-4a5d-996b-88fecde972a7", 00:29:27.634 "md_size": 32, 00:29:27.634 "md_interleave": false, 00:29:27.634 "dif_type": 0, 00:29:27.634 "assigned_rate_limits": { 00:29:27.634 "rw_ios_per_sec": 0, 00:29:27.634 "rw_mbytes_per_sec": 0, 00:29:27.634 "r_mbytes_per_sec": 0, 00:29:27.634 "w_mbytes_per_sec": 0 00:29:27.634 }, 00:29:27.634 "claimed": false, 00:29:27.634 "zoned": false, 00:29:27.634 "supported_io_types": { 00:29:27.634 "read": true, 00:29:27.634 "write": true, 00:29:27.634 "unmap": false, 00:29:27.634 "flush": false, 00:29:27.634 "reset": true, 00:29:27.634 "nvme_admin": false, 00:29:27.634 "nvme_io": false, 00:29:27.634 "nvme_io_md": false, 00:29:27.634 "write_zeroes": true, 00:29:27.634 "zcopy": false, 00:29:27.634 "get_zone_info": false, 00:29:27.634 "zone_management": false, 00:29:27.634 "zone_append": false, 00:29:27.634 "compare": false, 00:29:27.634 "compare_and_write": false, 00:29:27.634 "abort": false, 00:29:27.634 "seek_hole": false, 00:29:27.634 "seek_data": false, 00:29:27.634 "copy": false, 00:29:27.634 "nvme_iov_md": false 00:29:27.634 }, 00:29:27.634 "memory_domains": [ 00:29:27.634 { 00:29:27.634 "dma_device_id": "system", 00:29:27.634 "dma_device_type": 1 00:29:27.634 }, 00:29:27.634 { 00:29:27.634 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:27.634 "dma_device_type": 2 00:29:27.634 }, 00:29:27.634 { 00:29:27.634 "dma_device_id": "system", 00:29:27.634 "dma_device_type": 1 00:29:27.634 }, 00:29:27.634 { 00:29:27.634 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:27.634 "dma_device_type": 2 00:29:27.634 } 00:29:27.634 ], 00:29:27.634 "driver_specific": { 00:29:27.634 "raid": { 00:29:27.634 "uuid": "6406752e-bd24-4a5d-996b-88fecde972a7", 00:29:27.634 "strip_size_kb": 0, 00:29:27.634 "state": "online", 00:29:27.634 "raid_level": "raid1", 00:29:27.634 "superblock": true, 00:29:27.634 "num_base_bdevs": 2, 00:29:27.634 "num_base_bdevs_discovered": 2, 00:29:27.634 "num_base_bdevs_operational": 2, 00:29:27.634 "base_bdevs_list": [ 00:29:27.634 { 00:29:27.634 "name": "BaseBdev1", 00:29:27.634 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:27.634 "is_configured": true, 00:29:27.634 "data_offset": 256, 00:29:27.634 "data_size": 7936 00:29:27.634 }, 00:29:27.634 { 00:29:27.634 "name": "BaseBdev2", 00:29:27.634 "uuid": "9956c440-6eb4-4b9b-bcc7-77d38871a1c1", 00:29:27.634 "is_configured": true, 00:29:27.634 "data_offset": 256, 00:29:27.634 "data_size": 7936 00:29:27.634 } 00:29:27.634 ] 00:29:27.634 } 00:29:27.634 } 00:29:27.634 }' 00:29:27.634 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:27.634 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:29:27.634 BaseBdev2' 00:29:27.634 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:27.634 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:29:27.634 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:27.893 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:27.893 "name": "BaseBdev1", 00:29:27.893 
"aliases": [ 00:29:27.893 "e1232cd8-8393-4efd-984f-ff9c5d05bdb7" 00:29:27.893 ], 00:29:27.893 "product_name": "Malloc disk", 00:29:27.893 "block_size": 4096, 00:29:27.893 "num_blocks": 8192, 00:29:27.893 "uuid": "e1232cd8-8393-4efd-984f-ff9c5d05bdb7", 00:29:27.893 "md_size": 32, 00:29:27.893 "md_interleave": false, 00:29:27.893 "dif_type": 0, 00:29:27.893 "assigned_rate_limits": { 00:29:27.893 "rw_ios_per_sec": 0, 00:29:27.893 "rw_mbytes_per_sec": 0, 00:29:27.893 "r_mbytes_per_sec": 0, 00:29:27.893 "w_mbytes_per_sec": 0 00:29:27.893 }, 00:29:27.893 "claimed": true, 00:29:27.893 "claim_type": "exclusive_write", 00:29:27.893 "zoned": false, 00:29:27.893 "supported_io_types": { 00:29:27.893 "read": true, 00:29:27.893 "write": true, 00:29:27.893 "unmap": true, 00:29:27.893 "flush": true, 00:29:27.893 "reset": true, 00:29:27.893 "nvme_admin": false, 00:29:27.893 "nvme_io": false, 00:29:27.893 "nvme_io_md": false, 00:29:27.893 "write_zeroes": true, 00:29:27.893 "zcopy": true, 00:29:27.893 "get_zone_info": false, 00:29:27.893 "zone_management": false, 00:29:27.893 "zone_append": false, 00:29:27.893 "compare": false, 00:29:27.893 "compare_and_write": false, 00:29:27.893 "abort": true, 00:29:27.893 "seek_hole": false, 00:29:27.893 "seek_data": false, 00:29:27.893 "copy": true, 00:29:27.893 "nvme_iov_md": false 00:29:27.893 }, 00:29:27.893 "memory_domains": [ 00:29:27.893 { 00:29:27.893 "dma_device_id": "system", 00:29:27.893 "dma_device_type": 1 00:29:27.893 }, 00:29:27.893 { 00:29:27.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:27.893 "dma_device_type": 2 00:29:27.893 } 00:29:27.893 ], 00:29:27.893 "driver_specific": {} 00:29:27.893 }' 00:29:27.893 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # 
[[ 4096 == 4096 ]] 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:28.153 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:28.411 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:28.411 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:28.411 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:28.411 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:29:28.411 12:10:41 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:28.670 "name": "BaseBdev2", 00:29:28.670 "aliases": [ 00:29:28.670 "9956c440-6eb4-4b9b-bcc7-77d38871a1c1" 00:29:28.670 ], 00:29:28.670 "product_name": "Malloc disk", 00:29:28.670 "block_size": 4096, 00:29:28.670 "num_blocks": 8192, 00:29:28.670 "uuid": "9956c440-6eb4-4b9b-bcc7-77d38871a1c1", 00:29:28.670 "md_size": 32, 00:29:28.670 
"md_interleave": false, 00:29:28.670 "dif_type": 0, 00:29:28.670 "assigned_rate_limits": { 00:29:28.670 "rw_ios_per_sec": 0, 00:29:28.670 "rw_mbytes_per_sec": 0, 00:29:28.670 "r_mbytes_per_sec": 0, 00:29:28.670 "w_mbytes_per_sec": 0 00:29:28.670 }, 00:29:28.670 "claimed": true, 00:29:28.670 "claim_type": "exclusive_write", 00:29:28.670 "zoned": false, 00:29:28.670 "supported_io_types": { 00:29:28.670 "read": true, 00:29:28.670 "write": true, 00:29:28.670 "unmap": true, 00:29:28.670 "flush": true, 00:29:28.670 "reset": true, 00:29:28.670 "nvme_admin": false, 00:29:28.670 "nvme_io": false, 00:29:28.670 "nvme_io_md": false, 00:29:28.670 "write_zeroes": true, 00:29:28.670 "zcopy": true, 00:29:28.670 "get_zone_info": false, 00:29:28.670 "zone_management": false, 00:29:28.670 "zone_append": false, 00:29:28.670 "compare": false, 00:29:28.670 "compare_and_write": false, 00:29:28.670 "abort": true, 00:29:28.670 "seek_hole": false, 00:29:28.670 "seek_data": false, 00:29:28.670 "copy": true, 00:29:28.670 "nvme_iov_md": false 00:29:28.670 }, 00:29:28.670 "memory_domains": [ 00:29:28.670 { 00:29:28.670 "dma_device_id": "system", 00:29:28.670 "dma_device_type": 1 00:29:28.670 }, 00:29:28.670 { 00:29:28.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:28.670 "dma_device_type": 2 00:29:28.670 } 00:29:28.670 ], 00:29:28.670 "driver_specific": {} 00:29:28.670 }' 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:28.670 12:10:42 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:28.670 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:28.928 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:29:29.186 [2024-07-15 12:10:42.645764] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:29.186 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:29:29.445 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:29.445 "name": "Existed_Raid", 00:29:29.445 "uuid": "6406752e-bd24-4a5d-996b-88fecde972a7", 00:29:29.445 "strip_size_kb": 0, 00:29:29.445 "state": "online", 00:29:29.445 "raid_level": "raid1", 00:29:29.445 "superblock": true, 00:29:29.445 "num_base_bdevs": 2, 00:29:29.445 "num_base_bdevs_discovered": 1, 00:29:29.445 "num_base_bdevs_operational": 1, 00:29:29.445 "base_bdevs_list": [ 00:29:29.445 { 00:29:29.445 "name": null, 00:29:29.445 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:29:29.445 "is_configured": false, 00:29:29.445 "data_offset": 256, 00:29:29.445 "data_size": 7936 00:29:29.445 }, 00:29:29.445 { 00:29:29.445 "name": "BaseBdev2", 00:29:29.445 "uuid": "9956c440-6eb4-4b9b-bcc7-77d38871a1c1", 00:29:29.445 "is_configured": true, 00:29:29.445 "data_offset": 256, 00:29:29.445 "data_size": 7936 00:29:29.445 } 00:29:29.445 ] 00:29:29.445 }' 00:29:29.445 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:29.445 12:10:42 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:30.118 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:29:30.118 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:30.118 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.118 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:29:30.377 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:29:30.377 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:29:30.377 12:10:43 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:29:30.636 [2024-07-15 12:10:43.991635] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:29:30.637 [2024-07-15 12:10:43.991733] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:30.637 [2024-07-15 
12:10:44.005128] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:30.637 [2024-07-15 12:10:44.005166] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:30.637 [2024-07-15 12:10:44.005177] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ce240 name Existed_Raid, state offline 00:29:30.637 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:29:30.637 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:29:30.637 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:29:30.637 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1605846 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1605846 ']' 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1605846 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:30.896 12:10:44 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1605846 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1605846' 00:29:30.896 killing process with pid 1605846 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1605846 00:29:30.896 [2024-07-15 12:10:44.337494] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:30.896 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1605846 00:29:30.896 [2024-07-15 12:10:44.338563] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:31.155 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:29:31.155 00:29:31.155 real 0m10.301s 00:29:31.155 user 0m18.722s 00:29:31.155 sys 0m2.036s 00:29:31.155 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:31.155 12:10:44 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:31.155 ************************************ 00:29:31.155 END TEST raid_state_function_test_sb_md_separate 00:29:31.155 ************************************ 00:29:31.155 12:10:44 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:31.155 12:10:44 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:29:31.155 12:10:44 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:29:31.155 12:10:44 bdev_raid -- 
common/autotest_common.sh@1105 -- # xtrace_disable 00:29:31.155 12:10:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:31.155 ************************************ 00:29:31.155 START TEST raid_superblock_test_md_separate 00:29:31.155 ************************************ 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:29:31.155 12:10:44 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=1607406 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 1607406 /var/tmp/spdk-raid.sock 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1607406 ']' 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:31.155 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:31.156 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:31.156 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:31.156 12:10:44 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:31.156 [2024-07-15 12:10:44.689160] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:29:31.156 [2024-07-15 12:10:44.689225] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1607406 ] 00:29:31.415 [2024-07-15 12:10:44.816790] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:31.415 [2024-07-15 12:10:44.918551] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:31.415 [2024-07-15 12:10:44.981877] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:31.415 [2024-07-15 12:10:44.981918] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:32.352 12:10:45 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:29:32.352 malloc1 00:29:32.352 12:10:45 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:32.612 [2024-07-15 12:10:46.083963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:32.612 [2024-07-15 12:10:46.084011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:32.612 [2024-07-15 12:10:46.084033] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c5c40 00:29:32.612 [2024-07-15 12:10:46.084047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:32.612 [2024-07-15 12:10:46.085657] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:32.612 [2024-07-15 12:10:46.085693] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:32.612 pt1 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:29:32.612 12:10:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:29:32.612 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:29:32.871 malloc2 00:29:32.871 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:33.130 [2024-07-15 12:10:46.571995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:33.130 [2024-07-15 12:10:46.572042] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:33.130 [2024-07-15 12:10:46.572061] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16be000 00:29:33.130 [2024-07-15 12:10:46.572073] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:33.130 [2024-07-15 12:10:46.573465] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:33.130 [2024-07-15 12:10:46.573491] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:33.130 pt2 00:29:33.130 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:29:33.130 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:29:33.130 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:29:33.389 [2024-07-15 12:10:46.812649] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:33.389 [2024-07-15 12:10:46.813970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:33.389 [2024-07-15 12:10:46.814115] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x15ab2c0 00:29:33.389 [2024-07-15 12:10:46.814127] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:33.389 [2024-07-15 12:10:46.814199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c0980 00:29:33.389 [2024-07-15 12:10:46.814315] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x15ab2c0 00:29:33.389 [2024-07-15 12:10:46.814325] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x15ab2c0 00:29:33.389 [2024-07-15 12:10:46.814395] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:33.389 12:10:46 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:33.389 12:10:46 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:33.647 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:33.647 "name": "raid_bdev1", 00:29:33.647 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:33.647 "strip_size_kb": 0, 00:29:33.647 "state": "online", 00:29:33.647 "raid_level": "raid1", 00:29:33.647 "superblock": true, 00:29:33.647 "num_base_bdevs": 2, 00:29:33.647 "num_base_bdevs_discovered": 2, 00:29:33.647 "num_base_bdevs_operational": 2, 00:29:33.647 "base_bdevs_list": [ 00:29:33.647 { 00:29:33.647 "name": "pt1", 00:29:33.647 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:33.647 "is_configured": true, 00:29:33.647 "data_offset": 256, 00:29:33.647 "data_size": 7936 00:29:33.647 }, 00:29:33.647 { 00:29:33.647 "name": "pt2", 00:29:33.647 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:33.647 "is_configured": true, 00:29:33.647 "data_offset": 256, 00:29:33.647 "data_size": 7936 00:29:33.647 } 00:29:33.647 ] 00:29:33.647 }' 00:29:33.647 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:33.647 12:10:47 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:34.214 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:34.474 [2024-07-15 12:10:47.967950] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:34.474 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:34.474 "name": "raid_bdev1", 00:29:34.474 "aliases": [ 00:29:34.474 "2edb6abc-be6a-478a-9003-eb057f75a3d9" 00:29:34.474 ], 00:29:34.474 "product_name": "Raid Volume", 00:29:34.474 "block_size": 4096, 00:29:34.474 "num_blocks": 7936, 00:29:34.474 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:34.474 "md_size": 32, 00:29:34.474 "md_interleave": false, 00:29:34.474 "dif_type": 0, 00:29:34.474 "assigned_rate_limits": { 00:29:34.474 "rw_ios_per_sec": 0, 00:29:34.474 "rw_mbytes_per_sec": 0, 00:29:34.474 "r_mbytes_per_sec": 0, 00:29:34.474 "w_mbytes_per_sec": 0 00:29:34.474 }, 00:29:34.474 "claimed": false, 00:29:34.474 "zoned": false, 00:29:34.474 "supported_io_types": { 00:29:34.474 "read": true, 00:29:34.474 "write": true, 00:29:34.474 "unmap": false, 00:29:34.474 "flush": false, 00:29:34.474 "reset": true, 00:29:34.474 "nvme_admin": false, 00:29:34.474 "nvme_io": false, 00:29:34.474 "nvme_io_md": false, 00:29:34.474 "write_zeroes": true, 
00:29:34.474 "zcopy": false, 00:29:34.474 "get_zone_info": false, 00:29:34.474 "zone_management": false, 00:29:34.474 "zone_append": false, 00:29:34.474 "compare": false, 00:29:34.474 "compare_and_write": false, 00:29:34.474 "abort": false, 00:29:34.474 "seek_hole": false, 00:29:34.474 "seek_data": false, 00:29:34.474 "copy": false, 00:29:34.474 "nvme_iov_md": false 00:29:34.474 }, 00:29:34.474 "memory_domains": [ 00:29:34.474 { 00:29:34.474 "dma_device_id": "system", 00:29:34.474 "dma_device_type": 1 00:29:34.474 }, 00:29:34.474 { 00:29:34.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.474 "dma_device_type": 2 00:29:34.474 }, 00:29:34.474 { 00:29:34.474 "dma_device_id": "system", 00:29:34.474 "dma_device_type": 1 00:29:34.474 }, 00:29:34.474 { 00:29:34.474 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.474 "dma_device_type": 2 00:29:34.474 } 00:29:34.474 ], 00:29:34.474 "driver_specific": { 00:29:34.474 "raid": { 00:29:34.474 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:34.474 "strip_size_kb": 0, 00:29:34.474 "state": "online", 00:29:34.474 "raid_level": "raid1", 00:29:34.474 "superblock": true, 00:29:34.474 "num_base_bdevs": 2, 00:29:34.474 "num_base_bdevs_discovered": 2, 00:29:34.474 "num_base_bdevs_operational": 2, 00:29:34.474 "base_bdevs_list": [ 00:29:34.474 { 00:29:34.474 "name": "pt1", 00:29:34.474 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:34.474 "is_configured": true, 00:29:34.474 "data_offset": 256, 00:29:34.474 "data_size": 7936 00:29:34.474 }, 00:29:34.474 { 00:29:34.474 "name": "pt2", 00:29:34.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:34.474 "is_configured": true, 00:29:34.474 "data_offset": 256, 00:29:34.474 "data_size": 7936 00:29:34.474 } 00:29:34.474 ] 00:29:34.474 } 00:29:34.474 } 00:29:34.474 }' 00:29:34.474 12:10:47 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:34.474 12:10:48 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:34.474 pt2' 00:29:34.474 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:34.474 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:34.474 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:34.734 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:34.734 "name": "pt1", 00:29:34.734 "aliases": [ 00:29:34.734 "00000000-0000-0000-0000-000000000001" 00:29:34.734 ], 00:29:34.734 "product_name": "passthru", 00:29:34.734 "block_size": 4096, 00:29:34.734 "num_blocks": 8192, 00:29:34.734 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:34.734 "md_size": 32, 00:29:34.734 "md_interleave": false, 00:29:34.734 "dif_type": 0, 00:29:34.734 "assigned_rate_limits": { 00:29:34.734 "rw_ios_per_sec": 0, 00:29:34.734 "rw_mbytes_per_sec": 0, 00:29:34.734 "r_mbytes_per_sec": 0, 00:29:34.734 "w_mbytes_per_sec": 0 00:29:34.734 }, 00:29:34.734 "claimed": true, 00:29:34.734 "claim_type": "exclusive_write", 00:29:34.734 "zoned": false, 00:29:34.734 "supported_io_types": { 00:29:34.734 "read": true, 00:29:34.734 "write": true, 00:29:34.734 "unmap": true, 00:29:34.734 "flush": true, 00:29:34.734 "reset": true, 00:29:34.734 "nvme_admin": false, 00:29:34.734 "nvme_io": false, 00:29:34.734 "nvme_io_md": false, 00:29:34.734 "write_zeroes": true, 00:29:34.734 "zcopy": true, 00:29:34.734 "get_zone_info": false, 00:29:34.734 "zone_management": false, 00:29:34.734 "zone_append": false, 00:29:34.734 "compare": false, 00:29:34.734 "compare_and_write": false, 00:29:34.734 "abort": true, 00:29:34.734 "seek_hole": false, 00:29:34.734 "seek_data": false, 00:29:34.734 "copy": true, 00:29:34.734 
"nvme_iov_md": false 00:29:34.734 }, 00:29:34.734 "memory_domains": [ 00:29:34.734 { 00:29:34.734 "dma_device_id": "system", 00:29:34.734 "dma_device_type": 1 00:29:34.734 }, 00:29:34.734 { 00:29:34.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:34.734 "dma_device_type": 2 00:29:34.734 } 00:29:34.734 ], 00:29:34.734 "driver_specific": { 00:29:34.734 "passthru": { 00:29:34.734 "name": "pt1", 00:29:34.734 "base_bdev_name": "malloc1" 00:29:34.734 } 00:29:34.734 } 00:29:34.734 }' 00:29:34.734 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:34.993 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:35.253 "name": "pt2", 00:29:35.253 "aliases": [ 00:29:35.253 "00000000-0000-0000-0000-000000000002" 00:29:35.253 ], 00:29:35.253 "product_name": "passthru", 00:29:35.253 "block_size": 4096, 00:29:35.253 "num_blocks": 8192, 00:29:35.253 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:35.253 "md_size": 32, 00:29:35.253 "md_interleave": false, 00:29:35.253 "dif_type": 0, 00:29:35.253 "assigned_rate_limits": { 00:29:35.253 "rw_ios_per_sec": 0, 00:29:35.253 "rw_mbytes_per_sec": 0, 00:29:35.253 "r_mbytes_per_sec": 0, 00:29:35.253 "w_mbytes_per_sec": 0 00:29:35.253 }, 00:29:35.253 "claimed": true, 00:29:35.253 "claim_type": "exclusive_write", 00:29:35.253 "zoned": false, 00:29:35.253 "supported_io_types": { 00:29:35.253 "read": true, 00:29:35.253 "write": true, 00:29:35.253 "unmap": true, 00:29:35.253 "flush": true, 00:29:35.253 "reset": true, 00:29:35.253 "nvme_admin": false, 00:29:35.253 "nvme_io": false, 00:29:35.253 "nvme_io_md": false, 00:29:35.253 "write_zeroes": true, 00:29:35.253 "zcopy": true, 00:29:35.253 "get_zone_info": false, 00:29:35.253 "zone_management": false, 00:29:35.253 "zone_append": false, 00:29:35.253 "compare": false, 00:29:35.253 "compare_and_write": false, 00:29:35.253 "abort": true, 00:29:35.253 "seek_hole": false, 00:29:35.253 "seek_data": false, 00:29:35.253 "copy": true, 00:29:35.253 "nvme_iov_md": false 00:29:35.253 }, 00:29:35.253 "memory_domains": [ 00:29:35.253 { 00:29:35.253 "dma_device_id": "system", 00:29:35.253 "dma_device_type": 1 00:29:35.253 }, 00:29:35.253 { 00:29:35.253 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:35.253 "dma_device_type": 2 00:29:35.253 } 
00:29:35.253 ], 00:29:35.253 "driver_specific": { 00:29:35.253 "passthru": { 00:29:35.253 "name": "pt2", 00:29:35.253 "base_bdev_name": "malloc2" 00:29:35.253 } 00:29:35.253 } 00:29:35.253 }' 00:29:35.253 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.512 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:35.512 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:35.512 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.512 12:10:48 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:35.512 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:35.512 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.512 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:35.512 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:35.512 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.770 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:35.770 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:35.770 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:35.770 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:29:35.770 [2024-07-15 12:10:49.347588] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:36.029 12:10:49 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2edb6abc-be6a-478a-9003-eb057f75a3d9 00:29:36.029 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 2edb6abc-be6a-478a-9003-eb057f75a3d9 ']' 00:29:36.029 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:36.029 [2024-07-15 12:10:49.527828] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:36.029 [2024-07-15 12:10:49.527847] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:36.029 [2024-07-15 12:10:49.527899] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:36.029 [2024-07-15 12:10:49.527956] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:36.029 [2024-07-15 12:10:49.527967] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x15ab2c0 name raid_bdev1, state offline 00:29:36.029 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:36.029 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:29:36.288 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:29:36.288 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:29:36.288 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:36.288 12:10:49 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:29:36.547 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:29:36.547 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:37.116 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:29:37.116 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.376 12:10:50 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:29:37.376 12:10:50 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:29:37.636 [2024-07-15 12:10:51.051797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:29:37.636 [2024-07-15 12:10:51.053177] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:29:37.636 [2024-07-15 12:10:51.053232] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:29:37.636 [2024-07-15 12:10:51.053270] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:29:37.636 [2024-07-15 12:10:51.053288] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:37.636 [2024-07-15 12:10:51.053298] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1528dd0 name raid_bdev1, state configuring 00:29:37.636 request: 00:29:37.636 { 00:29:37.636 "name": "raid_bdev1", 00:29:37.636 "raid_level": "raid1", 00:29:37.636 "base_bdevs": [ 
00:29:37.636 "malloc1", 00:29:37.636 "malloc2" 00:29:37.636 ], 00:29:37.636 "superblock": false, 00:29:37.636 "method": "bdev_raid_create", 00:29:37.636 "req_id": 1 00:29:37.636 } 00:29:37.636 Got JSON-RPC error response 00:29:37.636 response: 00:29:37.636 { 00:29:37.636 "code": -17, 00:29:37.636 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:29:37.636 } 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # es=1 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:37.636 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:29:37.895 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:29:37.895 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:29:37.895 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:38.153 [2024-07-15 12:10:51.561076] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:38.153 [2024-07-15 12:10:51.561118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:38.153 [2024-07-15 12:10:51.561136] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c0920 
00:29:38.153 [2024-07-15 12:10:51.561147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:38.153 [2024-07-15 12:10:51.562570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:38.153 [2024-07-15 12:10:51.562596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:38.153 [2024-07-15 12:10:51.562639] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:38.153 [2024-07-15 12:10:51.562663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:38.153 pt1 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:38.153 12:10:51 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:38.719 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:38.719 "name": "raid_bdev1", 00:29:38.719 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:38.719 "strip_size_kb": 0, 00:29:38.719 "state": "configuring", 00:29:38.719 "raid_level": "raid1", 00:29:38.719 "superblock": true, 00:29:38.719 "num_base_bdevs": 2, 00:29:38.719 "num_base_bdevs_discovered": 1, 00:29:38.719 "num_base_bdevs_operational": 2, 00:29:38.719 "base_bdevs_list": [ 00:29:38.719 { 00:29:38.719 "name": "pt1", 00:29:38.719 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:38.719 "is_configured": true, 00:29:38.719 "data_offset": 256, 00:29:38.719 "data_size": 7936 00:29:38.720 }, 00:29:38.720 { 00:29:38.720 "name": null, 00:29:38.720 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:38.720 "is_configured": false, 00:29:38.720 "data_offset": 256, 00:29:38.720 "data_size": 7936 00:29:38.720 } 00:29:38.720 ] 00:29:38.720 }' 00:29:38.720 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:38.720 12:10:52 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:39.285 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:29:39.285 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:29:39.285 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:29:39.285 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:39.543 [2024-07-15 12:10:52.920692] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:39.543 [2024-07-15 12:10:52.920741] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:39.543 [2024-07-15 12:10:52.920760] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c1150 00:29:39.543 [2024-07-15 12:10:52.920773] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:39.543 [2024-07-15 12:10:52.920966] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:39.543 [2024-07-15 12:10:52.920982] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:39.543 [2024-07-15 12:10:52.921024] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:39.543 [2024-07-15 12:10:52.921042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:39.543 [2024-07-15 12:10:52.921130] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c16f0 00:29:39.543 [2024-07-15 12:10:52.921140] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:39.544 [2024-07-15 12:10:52.921199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c2530 00:29:39.544 [2024-07-15 12:10:52.921300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c16f0 00:29:39.544 [2024-07-15 12:10:52.921310] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c16f0 00:29:39.544 [2024-07-15 12:10:52.921379] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:39.544 pt2 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < 
num_base_bdevs )) 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:39.544 12:10:52 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:39.813 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:39.813 "name": "raid_bdev1", 00:29:39.813 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:39.813 "strip_size_kb": 0, 00:29:39.813 "state": "online", 00:29:39.813 "raid_level": "raid1", 00:29:39.813 "superblock": true, 00:29:39.813 "num_base_bdevs": 2, 00:29:39.813 
"num_base_bdevs_discovered": 2, 00:29:39.813 "num_base_bdevs_operational": 2, 00:29:39.813 "base_bdevs_list": [ 00:29:39.813 { 00:29:39.813 "name": "pt1", 00:29:39.813 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:39.813 "is_configured": true, 00:29:39.813 "data_offset": 256, 00:29:39.813 "data_size": 7936 00:29:39.813 }, 00:29:39.813 { 00:29:39.813 "name": "pt2", 00:29:39.813 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:39.813 "is_configured": true, 00:29:39.813 "data_offset": 256, 00:29:39.813 "data_size": 7936 00:29:39.813 } 00:29:39.813 ] 00:29:39.813 }' 00:29:39.813 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:39.813 12:10:53 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:29:40.382 12:10:53 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:40.641 [2024-07-15 12:10:54.031883] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:40.641 12:10:54 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:29:40.641 "name": "raid_bdev1", 00:29:40.641 "aliases": [ 00:29:40.641 "2edb6abc-be6a-478a-9003-eb057f75a3d9" 00:29:40.641 ], 00:29:40.641 "product_name": "Raid Volume", 00:29:40.641 "block_size": 4096, 00:29:40.641 "num_blocks": 7936, 00:29:40.641 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:40.641 "md_size": 32, 00:29:40.641 "md_interleave": false, 00:29:40.641 "dif_type": 0, 00:29:40.641 "assigned_rate_limits": { 00:29:40.641 "rw_ios_per_sec": 0, 00:29:40.641 "rw_mbytes_per_sec": 0, 00:29:40.641 "r_mbytes_per_sec": 0, 00:29:40.641 "w_mbytes_per_sec": 0 00:29:40.641 }, 00:29:40.641 "claimed": false, 00:29:40.641 "zoned": false, 00:29:40.641 "supported_io_types": { 00:29:40.641 "read": true, 00:29:40.641 "write": true, 00:29:40.641 "unmap": false, 00:29:40.641 "flush": false, 00:29:40.641 "reset": true, 00:29:40.641 "nvme_admin": false, 00:29:40.641 "nvme_io": false, 00:29:40.641 "nvme_io_md": false, 00:29:40.641 "write_zeroes": true, 00:29:40.641 "zcopy": false, 00:29:40.641 "get_zone_info": false, 00:29:40.641 "zone_management": false, 00:29:40.641 "zone_append": false, 00:29:40.641 "compare": false, 00:29:40.641 "compare_and_write": false, 00:29:40.641 "abort": false, 00:29:40.641 "seek_hole": false, 00:29:40.641 "seek_data": false, 00:29:40.641 "copy": false, 00:29:40.641 "nvme_iov_md": false 00:29:40.641 }, 00:29:40.641 "memory_domains": [ 00:29:40.641 { 00:29:40.641 "dma_device_id": "system", 00:29:40.641 "dma_device_type": 1 00:29:40.641 }, 00:29:40.641 { 00:29:40.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.641 "dma_device_type": 2 00:29:40.641 }, 00:29:40.641 { 00:29:40.641 "dma_device_id": "system", 00:29:40.641 "dma_device_type": 1 00:29:40.641 }, 00:29:40.641 { 00:29:40.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.641 "dma_device_type": 2 00:29:40.641 } 00:29:40.641 ], 00:29:40.641 "driver_specific": { 00:29:40.641 "raid": { 
00:29:40.641 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:40.641 "strip_size_kb": 0, 00:29:40.641 "state": "online", 00:29:40.641 "raid_level": "raid1", 00:29:40.641 "superblock": true, 00:29:40.641 "num_base_bdevs": 2, 00:29:40.641 "num_base_bdevs_discovered": 2, 00:29:40.641 "num_base_bdevs_operational": 2, 00:29:40.641 "base_bdevs_list": [ 00:29:40.641 { 00:29:40.641 "name": "pt1", 00:29:40.641 "uuid": "00000000-0000-0000-0000-000000000001", 00:29:40.641 "is_configured": true, 00:29:40.641 "data_offset": 256, 00:29:40.641 "data_size": 7936 00:29:40.641 }, 00:29:40.641 { 00:29:40.641 "name": "pt2", 00:29:40.641 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:40.641 "is_configured": true, 00:29:40.641 "data_offset": 256, 00:29:40.641 "data_size": 7936 00:29:40.641 } 00:29:40.641 ] 00:29:40.641 } 00:29:40.641 } 00:29:40.641 }' 00:29:40.641 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:29:40.641 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:29:40.641 pt2' 00:29:40.641 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:40.641 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:29:40.641 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:40.900 "name": "pt1", 00:29:40.900 "aliases": [ 00:29:40.900 "00000000-0000-0000-0000-000000000001" 00:29:40.900 ], 00:29:40.900 "product_name": "passthru", 00:29:40.900 "block_size": 4096, 00:29:40.900 "num_blocks": 8192, 00:29:40.900 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:29:40.900 "md_size": 32, 00:29:40.900 "md_interleave": false, 00:29:40.900 "dif_type": 0, 00:29:40.900 "assigned_rate_limits": { 00:29:40.900 "rw_ios_per_sec": 0, 00:29:40.900 "rw_mbytes_per_sec": 0, 00:29:40.900 "r_mbytes_per_sec": 0, 00:29:40.900 "w_mbytes_per_sec": 0 00:29:40.900 }, 00:29:40.900 "claimed": true, 00:29:40.900 "claim_type": "exclusive_write", 00:29:40.900 "zoned": false, 00:29:40.900 "supported_io_types": { 00:29:40.900 "read": true, 00:29:40.900 "write": true, 00:29:40.900 "unmap": true, 00:29:40.900 "flush": true, 00:29:40.900 "reset": true, 00:29:40.900 "nvme_admin": false, 00:29:40.900 "nvme_io": false, 00:29:40.900 "nvme_io_md": false, 00:29:40.900 "write_zeroes": true, 00:29:40.900 "zcopy": true, 00:29:40.900 "get_zone_info": false, 00:29:40.900 "zone_management": false, 00:29:40.900 "zone_append": false, 00:29:40.900 "compare": false, 00:29:40.900 "compare_and_write": false, 00:29:40.900 "abort": true, 00:29:40.900 "seek_hole": false, 00:29:40.900 "seek_data": false, 00:29:40.900 "copy": true, 00:29:40.900 "nvme_iov_md": false 00:29:40.900 }, 00:29:40.900 "memory_domains": [ 00:29:40.900 { 00:29:40.900 "dma_device_id": "system", 00:29:40.900 "dma_device_type": 1 00:29:40.900 }, 00:29:40.900 { 00:29:40.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:40.900 "dma_device_type": 2 00:29:40.900 } 00:29:40.900 ], 00:29:40.900 "driver_specific": { 00:29:40.900 "passthru": { 00:29:40.900 "name": "pt1", 00:29:40.900 "base_bdev_name": "malloc1" 00:29:40.900 } 00:29:40.900 } 00:29:40.900 }' 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:29:40.900 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:29:41.158 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:29:41.417 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:29:41.417 "name": "pt2", 00:29:41.417 "aliases": [ 00:29:41.417 "00000000-0000-0000-0000-000000000002" 00:29:41.417 ], 00:29:41.417 "product_name": "passthru", 00:29:41.417 "block_size": 4096, 00:29:41.417 "num_blocks": 8192, 00:29:41.417 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:41.417 "md_size": 32, 00:29:41.417 "md_interleave": false, 00:29:41.417 "dif_type": 0, 00:29:41.417 "assigned_rate_limits": { 00:29:41.417 "rw_ios_per_sec": 0, 00:29:41.417 "rw_mbytes_per_sec": 0, 00:29:41.417 "r_mbytes_per_sec": 0, 00:29:41.417 
"w_mbytes_per_sec": 0 00:29:41.417 }, 00:29:41.417 "claimed": true, 00:29:41.417 "claim_type": "exclusive_write", 00:29:41.417 "zoned": false, 00:29:41.417 "supported_io_types": { 00:29:41.417 "read": true, 00:29:41.417 "write": true, 00:29:41.417 "unmap": true, 00:29:41.417 "flush": true, 00:29:41.417 "reset": true, 00:29:41.417 "nvme_admin": false, 00:29:41.417 "nvme_io": false, 00:29:41.417 "nvme_io_md": false, 00:29:41.417 "write_zeroes": true, 00:29:41.417 "zcopy": true, 00:29:41.417 "get_zone_info": false, 00:29:41.417 "zone_management": false, 00:29:41.417 "zone_append": false, 00:29:41.417 "compare": false, 00:29:41.417 "compare_and_write": false, 00:29:41.417 "abort": true, 00:29:41.417 "seek_hole": false, 00:29:41.417 "seek_data": false, 00:29:41.417 "copy": true, 00:29:41.417 "nvme_iov_md": false 00:29:41.417 }, 00:29:41.417 "memory_domains": [ 00:29:41.417 { 00:29:41.417 "dma_device_id": "system", 00:29:41.417 "dma_device_type": 1 00:29:41.417 }, 00:29:41.417 { 00:29:41.417 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:29:41.417 "dma_device_type": 2 00:29:41.417 } 00:29:41.417 ], 00:29:41.417 "driver_specific": { 00:29:41.417 "passthru": { 00:29:41.417 "name": "pt2", 00:29:41.417 "base_bdev_name": "malloc2" 00:29:41.417 } 00:29:41.417 } 00:29:41.417 }' 00:29:41.417 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.417 12:10:54 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:29:41.675 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:29:41.675 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.675 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:29:41.676 12:10:55 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:29:41.676 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:29:41.934 [2024-07-15 12:10:55.499754] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 2edb6abc-be6a-478a-9003-eb057f75a3d9 '!=' 2edb6abc-be6a-478a-9003-eb057f75a3d9 ']' 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:29:41.934 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:29:42.191 [2024-07-15 12:10:55.752206] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:42.191 12:10:55 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:42.450 12:10:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:42.450 "name": "raid_bdev1", 00:29:42.450 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:42.450 "strip_size_kb": 0, 00:29:42.450 "state": "online", 00:29:42.450 "raid_level": "raid1", 00:29:42.450 "superblock": true, 00:29:42.450 "num_base_bdevs": 2, 00:29:42.450 "num_base_bdevs_discovered": 1, 00:29:42.450 "num_base_bdevs_operational": 1, 00:29:42.450 
"base_bdevs_list": [ 00:29:42.450 { 00:29:42.450 "name": null, 00:29:42.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:42.450 "is_configured": false, 00:29:42.450 "data_offset": 256, 00:29:42.450 "data_size": 7936 00:29:42.450 }, 00:29:42.450 { 00:29:42.450 "name": "pt2", 00:29:42.450 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:42.450 "is_configured": true, 00:29:42.450 "data_offset": 256, 00:29:42.450 "data_size": 7936 00:29:42.450 } 00:29:42.450 ] 00:29:42.450 }' 00:29:42.450 12:10:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:42.450 12:10:56 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:43.385 12:10:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:43.385 [2024-07-15 12:10:56.835031] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:43.385 [2024-07-15 12:10:56.835055] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:43.385 [2024-07-15 12:10:56.835104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:43.385 [2024-07-15 12:10:56.835148] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:43.385 [2024-07-15 12:10:56.835159] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c16f0 name raid_bdev1, state offline 00:29:43.385 12:10:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:43.385 12:10:56 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:29:43.952 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 
-- # raid_bdev= 00:29:43.952 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:29:43.952 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:29:43.952 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:43.952 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:29:44.210 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:29:44.469 [2024-07-15 12:10:57.833616] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:29:44.469 [2024-07-15 12:10:57.833658] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:44.469 [2024-07-15 12:10:57.833674] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16bd120 00:29:44.469 [2024-07-15 12:10:57.833691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:44.469 [2024-07-15 12:10:57.835131] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:44.469 
[2024-07-15 12:10:57.835156] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:29:44.469 [2024-07-15 12:10:57.835205] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:29:44.469 [2024-07-15 12:10:57.835230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:44.469 [2024-07-15 12:10:57.835303] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16be3d0 00:29:44.469 [2024-07-15 12:10:57.835313] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:44.469 [2024-07-15 12:10:57.835366] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c2060 00:29:44.469 [2024-07-15 12:10:57.835462] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16be3d0 00:29:44.469 [2024-07-15 12:10:57.835472] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16be3d0 00:29:44.469 [2024-07-15 12:10:57.835536] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:44.469 pt2 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:44.469 12:10:57 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:44.728 12:10:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:44.728 "name": "raid_bdev1", 00:29:44.728 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:44.728 "strip_size_kb": 0, 00:29:44.728 "state": "online", 00:29:44.728 "raid_level": "raid1", 00:29:44.728 "superblock": true, 00:29:44.728 "num_base_bdevs": 2, 00:29:44.728 "num_base_bdevs_discovered": 1, 00:29:44.728 "num_base_bdevs_operational": 1, 00:29:44.728 "base_bdevs_list": [ 00:29:44.728 { 00:29:44.728 "name": null, 00:29:44.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:44.728 "is_configured": false, 00:29:44.728 "data_offset": 256, 00:29:44.728 "data_size": 7936 00:29:44.728 }, 00:29:44.728 { 00:29:44.728 "name": "pt2", 00:29:44.728 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:44.728 "is_configured": true, 00:29:44.728 "data_offset": 256, 00:29:44.728 "data_size": 7936 00:29:44.728 } 00:29:44.728 ] 00:29:44.728 }' 00:29:44.728 12:10:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:44.728 12:10:58 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:45.294 12:10:58 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:29:45.553 [2024-07-15 12:10:58.996681] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:45.553 [2024-07-15 12:10:58.996710] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:29:45.553 [2024-07-15 12:10:58.996758] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:45.553 [2024-07-15 12:10:58.996800] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:45.553 [2024-07-15 12:10:58.996812] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16be3d0 name raid_bdev1, state offline 00:29:45.553 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:45.553 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:29:45.811 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:29:45.811 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:29:45.811 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:29:45.811 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:29:46.069 [2024-07-15 12:10:59.493974] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:29:46.069 [2024-07-15 12:10:59.494019] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:46.069 [2024-07-15 12:10:59.494036] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1529ac0 00:29:46.069 [2024-07-15 12:10:59.494047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:46.069 [2024-07-15 12:10:59.495489] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:46.069 [2024-07-15 12:10:59.495515] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:29:46.069 [2024-07-15 12:10:59.495557] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:29:46.069 [2024-07-15 12:10:59.495580] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:29:46.069 [2024-07-15 12:10:59.495669] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:29:46.069 [2024-07-15 12:10:59.495681] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:29:46.069 [2024-07-15 12:10:59.495704] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c5200 name raid_bdev1, state configuring 00:29:46.069 [2024-07-15 12:10:59.495727] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:29:46.069 [2024-07-15 12:10:59.495778] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c2e20 00:29:46.069 [2024-07-15 12:10:59.495788] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:46.069 [2024-07-15 12:10:59.495841] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c38b0 00:29:46.069 [2024-07-15 12:10:59.495935] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c2e20 00:29:46.069 [2024-07-15 12:10:59.495944] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c2e20 00:29:46.069 [2024-07-15 12:10:59.496015] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:29:46.069 pt1 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:46.069 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:46.327 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:46.327 "name": "raid_bdev1", 00:29:46.327 "uuid": "2edb6abc-be6a-478a-9003-eb057f75a3d9", 00:29:46.327 "strip_size_kb": 0, 00:29:46.327 "state": "online", 00:29:46.327 "raid_level": 
"raid1", 00:29:46.327 "superblock": true, 00:29:46.327 "num_base_bdevs": 2, 00:29:46.327 "num_base_bdevs_discovered": 1, 00:29:46.327 "num_base_bdevs_operational": 1, 00:29:46.327 "base_bdevs_list": [ 00:29:46.327 { 00:29:46.327 "name": null, 00:29:46.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:46.327 "is_configured": false, 00:29:46.327 "data_offset": 256, 00:29:46.327 "data_size": 7936 00:29:46.327 }, 00:29:46.327 { 00:29:46.327 "name": "pt2", 00:29:46.327 "uuid": "00000000-0000-0000-0000-000000000002", 00:29:46.327 "is_configured": true, 00:29:46.328 "data_offset": 256, 00:29:46.328 "data_size": 7936 00:29:46.328 } 00:29:46.328 ] 00:29:46.328 }' 00:29:46.328 12:10:59 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:46.328 12:10:59 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:47.262 12:11:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:29:47.262 12:11:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:29:47.520 12:11:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:29:47.520 12:11:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:47.520 12:11:00 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:29:48.085 [2024-07-15 12:11:01.407267] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 2edb6abc-be6a-478a-9003-eb057f75a3d9 '!=' 2edb6abc-be6a-478a-9003-eb057f75a3d9 ']' 
00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 1607406 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1607406 ']' 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@952 -- # kill -0 1607406 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # uname 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1607406 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1607406' 00:29:48.085 killing process with pid 1607406 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # kill 1607406 00:29:48.085 [2024-07-15 12:11:01.490730] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:29:48.085 [2024-07-15 12:11:01.490790] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:29:48.085 [2024-07-15 12:11:01.490837] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:29:48.085 [2024-07-15 12:11:01.490849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c2e20 name raid_bdev1, state offline 00:29:48.085 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@972 -- # wait 1607406 00:29:48.085 [2024-07-15 12:11:01.516867] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:29:48.344 12:11:01 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:29:48.344 00:29:48.344 real 0m17.116s 00:29:48.344 user 0m31.027s 00:29:48.344 sys 0m3.160s 00:29:48.344 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 00:29:48.344 12:11:01 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:48.344 ************************************ 00:29:48.344 END TEST raid_superblock_test_md_separate 00:29:48.344 ************************************ 00:29:48.344 12:11:01 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:29:48.344 12:11:01 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:29:48.344 12:11:01 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:29:48.344 12:11:01 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:29:48.344 12:11:01 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:29:48.344 12:11:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:29:48.344 ************************************ 00:29:48.344 START TEST raid_rebuild_test_sb_md_separate 00:29:48.344 ************************************ 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false true 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:29:48.344 12:11:01 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate 
-- bdev/bdev_raid.sh@588 -- # strip_size=0 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=1609881 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 1609881 /var/tmp/spdk-raid.sock 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@829 -- # '[' -z 1609881 ']' 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local max_retries=100 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:29:48.344 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # xtrace_disable 00:29:48.344 12:11:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:48.344 [2024-07-15 12:11:01.897361] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:29:48.344 [2024-07-15 12:11:01.897442] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1609881 ] 00:29:48.344 I/O size of 3145728 is greater than zero copy threshold (65536). 00:29:48.344 Zero copy mechanism will not be used. 00:29:48.602 [2024-07-15 12:11:02.042551] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:29:48.602 [2024-07-15 12:11:02.152274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:48.860 [2024-07-15 12:11:02.210091] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:48.860 [2024-07-15 12:11:02.210116] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:29:48.860 12:11:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:29:48.860 12:11:02 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@862 -- # return 0 00:29:48.860 12:11:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:48.860 12:11:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:29:49.427 BaseBdev1_malloc 00:29:49.427 12:11:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:29:49.686 [2024-07-15 12:11:03.103664] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:29:49.686 [2024-07-15 12:11:03.103719] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:49.686 [2024-07-15 
12:11:03.103744] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1310280 00:29:49.686 [2024-07-15 12:11:03.103757] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:49.686 [2024-07-15 12:11:03.105248] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:49.686 [2024-07-15 12:11:03.105275] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:29:49.686 BaseBdev1 00:29:49.686 12:11:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:29:49.686 12:11:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:29:49.945 BaseBdev2_malloc 00:29:49.945 12:11:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:29:50.203 [2024-07-15 12:11:03.603481] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:29:50.203 [2024-07-15 12:11:03.603523] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:50.203 [2024-07-15 12:11:03.603544] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14a20b0 00:29:50.203 [2024-07-15 12:11:03.603563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:50.203 [2024-07-15 12:11:03.605006] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:50.203 [2024-07-15 12:11:03.605034] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:29:50.203 BaseBdev2 00:29:50.203 12:11:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:29:50.771 spare_malloc 00:29:50.771 12:11:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:29:51.030 spare_delay 00:29:51.030 12:11:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:29:51.288 [2024-07-15 12:11:04.881490] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:29:51.289 [2024-07-15 12:11:04.881537] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:29:51.289 [2024-07-15 12:11:04.881559] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14a4750 00:29:51.289 [2024-07-15 12:11:04.881572] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:29:51.289 [2024-07-15 12:11:04.883006] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:29:51.289 [2024-07-15 12:11:04.883032] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:29:51.547 spare 00:29:51.547 12:11:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:29:51.806 [2024-07-15 12:11:05.390838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:29:51.806 [2024-07-15 12:11:05.392188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:29:51.806 [2024-07-15 12:11:05.392353] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14a6340 00:29:51.806 [2024-07-15 12:11:05.392365] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:29:51.806 [2024-07-15 12:11:05.392444] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1310b70 00:29:51.806 [2024-07-15 12:11:05.392555] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14a6340 00:29:51.806 [2024-07-15 12:11:05.392564] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14a6340 00:29:51.806 [2024-07-15 12:11:05.392633] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:52.066 12:11:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:52.066 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:52.634 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:52.634 "name": "raid_bdev1", 00:29:52.634 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:29:52.634 "strip_size_kb": 0, 00:29:52.634 "state": "online", 00:29:52.634 "raid_level": "raid1", 00:29:52.634 "superblock": true, 00:29:52.634 "num_base_bdevs": 2, 00:29:52.634 "num_base_bdevs_discovered": 2, 00:29:52.634 "num_base_bdevs_operational": 2, 00:29:52.634 "base_bdevs_list": [ 00:29:52.634 { 00:29:52.634 "name": "BaseBdev1", 00:29:52.634 "uuid": "960a78b4-460b-5c3d-a318-710a01a16c1d", 00:29:52.634 "is_configured": true, 00:29:52.634 "data_offset": 256, 00:29:52.634 "data_size": 7936 00:29:52.634 }, 00:29:52.634 { 00:29:52.634 "name": "BaseBdev2", 00:29:52.634 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:29:52.634 "is_configured": true, 00:29:52.634 "data_offset": 256, 00:29:52.634 "data_size": 7936 00:29:52.634 } 00:29:52.634 ] 00:29:52.634 }' 00:29:52.634 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:52.634 12:11:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:53.571 12:11:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:29:53.571 12:11:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:29:53.571 [2024-07-15 12:11:07.035423] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:29:53.571 
12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:29:53.571 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:53.571 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:29:53.830 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:29:53.831 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:53.831 12:11:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:29:54.090 [2024-07-15 12:11:07.520494] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1310b70 00:29:54.090 /dev/nbd0 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:29:54.090 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:29:54.091 1+0 records in 00:29:54.091 1+0 records out 00:29:54.091 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263158 s, 15.6 MB/s 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:29:54.091 12:11:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:29:55.082 7936+0 records in 00:29:55.082 7936+0 records out 00:29:55.082 32505856 bytes (33 MB, 31 MiB) copied, 0.763429 s, 42.6 MB/s 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:29:55.082 12:11:08 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:29:55.082 [2024-07-15 12:11:08.616345] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:29:55.082 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:29:55.341 [2024-07-15 12:11:08.849005] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:55.341 12:11:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:55.600 12:11:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:55.600 "name": "raid_bdev1", 00:29:55.600 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:29:55.600 "strip_size_kb": 0, 00:29:55.600 "state": "online", 00:29:55.600 "raid_level": "raid1", 00:29:55.600 "superblock": true, 00:29:55.600 "num_base_bdevs": 2, 00:29:55.600 "num_base_bdevs_discovered": 1, 00:29:55.600 "num_base_bdevs_operational": 1, 00:29:55.600 "base_bdevs_list": [ 00:29:55.600 { 00:29:55.600 "name": null, 00:29:55.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:55.601 "is_configured": false, 00:29:55.601 "data_offset": 256, 00:29:55.601 "data_size": 7936 00:29:55.601 }, 00:29:55.601 { 00:29:55.601 "name": "BaseBdev2", 
00:29:55.601 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:29:55.601 "is_configured": true, 00:29:55.601 "data_offset": 256, 00:29:55.601 "data_size": 7936 00:29:55.601 } 00:29:55.601 ] 00:29:55.601 }' 00:29:55.601 12:11:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:55.601 12:11:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:56.169 12:11:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:56.428 [2024-07-15 12:11:09.959982] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:56.428 [2024-07-15 12:11:09.962302] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a6f50 00:29:56.428 [2024-07-15 12:11:09.964595] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:56.428 12:11:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:57.806 12:11:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:57.806 12:11:10 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:57.806 "name": "raid_bdev1", 00:29:57.806 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:29:57.806 "strip_size_kb": 0, 00:29:57.806 "state": "online", 00:29:57.806 "raid_level": "raid1", 00:29:57.806 "superblock": true, 00:29:57.806 "num_base_bdevs": 2, 00:29:57.806 "num_base_bdevs_discovered": 2, 00:29:57.806 "num_base_bdevs_operational": 2, 00:29:57.806 "process": { 00:29:57.806 "type": "rebuild", 00:29:57.806 "target": "spare", 00:29:57.806 "progress": { 00:29:57.806 "blocks": 2816, 00:29:57.806 "percent": 35 00:29:57.806 } 00:29:57.806 }, 00:29:57.806 "base_bdevs_list": [ 00:29:57.806 { 00:29:57.806 "name": "spare", 00:29:57.806 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:29:57.806 "is_configured": true, 00:29:57.806 "data_offset": 256, 00:29:57.806 "data_size": 7936 00:29:57.806 }, 00:29:57.806 { 00:29:57.806 "name": "BaseBdev2", 00:29:57.806 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:29:57.806 "is_configured": true, 00:29:57.806 "data_offset": 256, 00:29:57.806 "data_size": 7936 00:29:57.806 } 00:29:57.806 ] 00:29:57.806 }' 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:29:57.806 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:29:58.065 [2024-07-15 12:11:11.497895] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:58.065 [2024-07-15 12:11:11.577258] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:29:58.065 [2024-07-15 12:11:11.577303] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:29:58.065 [2024-07-15 12:11:11.577317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:29:58.065 [2024-07-15 12:11:11.577325] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.065 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:58.324 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:29:58.324 "name": "raid_bdev1", 00:29:58.324 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:29:58.324 "strip_size_kb": 0, 00:29:58.324 "state": "online", 00:29:58.324 "raid_level": "raid1", 00:29:58.324 "superblock": true, 00:29:58.324 "num_base_bdevs": 2, 00:29:58.324 "num_base_bdevs_discovered": 1, 00:29:58.324 "num_base_bdevs_operational": 1, 00:29:58.324 "base_bdevs_list": [ 00:29:58.324 { 00:29:58.324 "name": null, 00:29:58.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:58.324 "is_configured": false, 00:29:58.324 "data_offset": 256, 00:29:58.324 "data_size": 7936 00:29:58.324 }, 00:29:58.324 { 00:29:58.324 "name": "BaseBdev2", 00:29:58.324 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:29:58.324 "is_configured": true, 00:29:58.324 "data_offset": 256, 00:29:58.324 "data_size": 7936 00:29:58.324 } 00:29:58.324 ] 00:29:58.324 }' 00:29:58.324 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:29:58.324 12:11:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:29:58.892 12:11:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:29:58.892 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:29:59.151 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:29:59.151 "name": "raid_bdev1", 00:29:59.151 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:29:59.151 "strip_size_kb": 0, 00:29:59.151 "state": "online", 00:29:59.151 "raid_level": "raid1", 00:29:59.151 "superblock": true, 00:29:59.151 "num_base_bdevs": 2, 00:29:59.151 "num_base_bdevs_discovered": 1, 00:29:59.151 "num_base_bdevs_operational": 1, 00:29:59.151 "base_bdevs_list": [ 00:29:59.151 { 00:29:59.151 "name": null, 00:29:59.151 "uuid": "00000000-0000-0000-0000-000000000000", 00:29:59.151 "is_configured": false, 00:29:59.152 "data_offset": 256, 00:29:59.152 "data_size": 7936 00:29:59.152 }, 00:29:59.152 { 00:29:59.152 "name": "BaseBdev2", 00:29:59.152 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:29:59.152 "is_configured": true, 00:29:59.152 "data_offset": 256, 00:29:59.152 "data_size": 7936 00:29:59.152 } 00:29:59.152 ] 00:29:59.152 }' 00:29:59.152 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:29:59.411 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:29:59.411 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:29:59.411 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:29:59.411 12:11:12 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:29:59.670 [2024-07-15 12:11:13.260834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:29:59.670 [2024-07-15 12:11:13.263129] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a8ec0 00:29:59.670 [2024-07-15 12:11:13.264583] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:29:59.928 12:11:13 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:00.864 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:01.123 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:01.123 "name": "raid_bdev1", 00:30:01.124 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:01.124 "strip_size_kb": 0, 00:30:01.124 "state": "online", 00:30:01.124 "raid_level": "raid1", 00:30:01.124 "superblock": true, 00:30:01.124 "num_base_bdevs": 2, 
00:30:01.124 "num_base_bdevs_discovered": 2, 00:30:01.124 "num_base_bdevs_operational": 2, 00:30:01.124 "process": { 00:30:01.124 "type": "rebuild", 00:30:01.124 "target": "spare", 00:30:01.124 "progress": { 00:30:01.124 "blocks": 3072, 00:30:01.124 "percent": 38 00:30:01.124 } 00:30:01.124 }, 00:30:01.124 "base_bdevs_list": [ 00:30:01.124 { 00:30:01.124 "name": "spare", 00:30:01.124 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:01.124 "is_configured": true, 00:30:01.124 "data_offset": 256, 00:30:01.124 "data_size": 7936 00:30:01.124 }, 00:30:01.124 { 00:30:01.124 "name": "BaseBdev2", 00:30:01.124 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:01.124 "is_configured": true, 00:30:01.124 "data_offset": 256, 00:30:01.124 "data_size": 7936 00:30:01.124 } 00:30:01.124 ] 00:30:01.124 }' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:30:01.124 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1108 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:01.124 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:01.388 "name": "raid_bdev1", 00:30:01.388 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:01.388 "strip_size_kb": 0, 00:30:01.388 "state": "online", 00:30:01.388 "raid_level": "raid1", 00:30:01.388 "superblock": true, 00:30:01.388 "num_base_bdevs": 2, 00:30:01.388 "num_base_bdevs_discovered": 2, 00:30:01.388 "num_base_bdevs_operational": 2, 00:30:01.388 "process": { 00:30:01.388 "type": "rebuild", 00:30:01.388 "target": "spare", 00:30:01.388 "progress": { 00:30:01.388 "blocks": 3840, 00:30:01.388 "percent": 48 00:30:01.388 } 00:30:01.388 }, 00:30:01.388 "base_bdevs_list": [ 00:30:01.388 { 00:30:01.388 "name": "spare", 00:30:01.388 "uuid": 
"b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:01.388 "is_configured": true, 00:30:01.388 "data_offset": 256, 00:30:01.388 "data_size": 7936 00:30:01.388 }, 00:30:01.388 { 00:30:01.388 "name": "BaseBdev2", 00:30:01.388 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:01.388 "is_configured": true, 00:30:01.388 "data_offset": 256, 00:30:01.388 "data_size": 7936 00:30:01.388 } 00:30:01.388 ] 00:30:01.388 }' 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:01.388 12:11:14 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:02.768 12:11:15 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:02.768 12:11:15 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:02.768 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:02.769 "name": "raid_bdev1", 00:30:02.769 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:02.769 "strip_size_kb": 0, 00:30:02.769 "state": "online", 00:30:02.769 "raid_level": "raid1", 00:30:02.769 "superblock": true, 00:30:02.769 "num_base_bdevs": 2, 00:30:02.769 "num_base_bdevs_discovered": 2, 00:30:02.769 "num_base_bdevs_operational": 2, 00:30:02.769 "process": { 00:30:02.769 "type": "rebuild", 00:30:02.769 "target": "spare", 00:30:02.769 "progress": { 00:30:02.769 "blocks": 7424, 00:30:02.769 "percent": 93 00:30:02.769 } 00:30:02.769 }, 00:30:02.769 "base_bdevs_list": [ 00:30:02.769 { 00:30:02.769 "name": "spare", 00:30:02.769 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:02.769 "is_configured": true, 00:30:02.769 "data_offset": 256, 00:30:02.769 "data_size": 7936 00:30:02.769 }, 00:30:02.769 { 00:30:02.769 "name": "BaseBdev2", 00:30:02.769 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:02.769 "is_configured": true, 00:30:02.769 "data_offset": 256, 00:30:02.769 "data_size": 7936 00:30:02.769 } 00:30:02.769 ] 00:30:02.769 }' 00:30:02.769 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:02.769 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:02.769 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:02.769 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:02.769 12:11:16 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:30:03.029 [2024-07-15 12:11:16.388617] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:30:03.029 [2024-07-15 12:11:16.388672] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:30:03.029 [2024-07-15 12:11:16.388756] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:03.970 "name": "raid_bdev1", 00:30:03.970 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:03.970 "strip_size_kb": 0, 00:30:03.970 "state": "online", 00:30:03.970 "raid_level": "raid1", 00:30:03.970 "superblock": true, 00:30:03.970 "num_base_bdevs": 2, 00:30:03.970 "num_base_bdevs_discovered": 2, 00:30:03.970 "num_base_bdevs_operational": 2, 00:30:03.970 "base_bdevs_list": [ 00:30:03.970 { 00:30:03.970 "name": "spare", 00:30:03.970 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 
00:30:03.970 "is_configured": true, 00:30:03.970 "data_offset": 256, 00:30:03.970 "data_size": 7936 00:30:03.970 }, 00:30:03.970 { 00:30:03.970 "name": "BaseBdev2", 00:30:03.970 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:03.970 "is_configured": true, 00:30:03.970 "data_offset": 256, 00:30:03.970 "data_size": 7936 00:30:03.970 } 00:30:03.970 ] 00:30:03.970 }' 00:30:03.970 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.230 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.490 12:11:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:04.490 "name": "raid_bdev1", 00:30:04.490 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:04.490 "strip_size_kb": 0, 00:30:04.490 "state": "online", 00:30:04.490 "raid_level": "raid1", 00:30:04.490 "superblock": true, 00:30:04.490 "num_base_bdevs": 2, 00:30:04.490 "num_base_bdevs_discovered": 2, 00:30:04.490 "num_base_bdevs_operational": 2, 00:30:04.490 "base_bdevs_list": [ 00:30:04.490 { 00:30:04.490 "name": "spare", 00:30:04.490 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:04.490 "is_configured": true, 00:30:04.490 "data_offset": 256, 00:30:04.490 "data_size": 7936 00:30:04.490 }, 00:30:04.490 { 00:30:04.490 "name": "BaseBdev2", 00:30:04.490 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:04.490 "is_configured": true, 00:30:04.490 "data_offset": 256, 00:30:04.490 "data_size": 7936 00:30:04.490 } 00:30:04.490 ] 00:30:04.490 }' 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:04.490 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:04.491 12:11:17 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:04.491 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:04.491 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:04.491 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:04.491 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:04.491 12:11:17 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:04.491 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:04.491 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:04.751 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:04.751 "name": "raid_bdev1", 00:30:04.751 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:04.751 "strip_size_kb": 0, 00:30:04.751 "state": "online", 00:30:04.751 "raid_level": "raid1", 00:30:04.751 "superblock": true, 00:30:04.751 "num_base_bdevs": 2, 00:30:04.751 "num_base_bdevs_discovered": 2, 00:30:04.751 "num_base_bdevs_operational": 2, 00:30:04.751 "base_bdevs_list": [ 00:30:04.751 { 00:30:04.751 "name": "spare", 00:30:04.751 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:04.751 "is_configured": true, 00:30:04.751 "data_offset": 256, 00:30:04.751 "data_size": 7936 00:30:04.751 }, 00:30:04.751 { 00:30:04.751 "name": "BaseBdev2", 00:30:04.751 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:04.751 "is_configured": true, 00:30:04.751 "data_offset": 256, 00:30:04.751 "data_size": 7936 00:30:04.751 } 00:30:04.751 ] 
00:30:04.751 }' 00:30:04.751 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:04.751 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:05.321 12:11:18 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:05.580 [2024-07-15 12:11:19.019104] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:05.580 [2024-07-15 12:11:19.019132] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:05.580 [2024-07-15 12:11:19.019191] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:05.580 [2024-07-15 12:11:19.019249] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:05.580 [2024-07-15 12:11:19.019261] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a6340 name raid_bdev1, state offline 00:30:05.580 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:05.580 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:30:05.840 12:11:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:05.840 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:30:06.411 /dev/nbd0 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:30:06.411 12:11:19 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:06.411 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:06.411 1+0 records in 00:30:06.411 1+0 records out 00:30:06.412 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249686 s, 16.4 MB/s 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:06.412 12:11:19 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:30:06.672 /dev/nbd1 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:06.672 12:11:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local i 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # break 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:06.672 1+0 records in 00:30:06.672 1+0 records out 00:30:06.672 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000333707 s, 12.3 MB/s 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # size=4096 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:30:06.672 12:11:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # return 0 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:06.672 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:06.932 12:11:20 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:06.932 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:30:07.192 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:07.452 12:11:20 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:07.711 [2024-07-15 12:11:21.181112] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:07.711 [2024-07-15 12:11:21.181156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:07.711 [2024-07-15 12:11:21.181177] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14a6d50 00:30:07.711 [2024-07-15 12:11:21.181195] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:07.711 [2024-07-15 12:11:21.182651] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:07.711 [2024-07-15 12:11:21.182677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:07.711 [2024-07-15 12:11:21.182740] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:07.711 [2024-07-15 12:11:21.182765] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:07.711 [2024-07-15 12:11:21.182858] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:07.711 spare 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:07.711 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:07.711 [2024-07-15 12:11:21.283163] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14a5180 00:30:07.711 [2024-07-15 12:11:21.283177] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:30:07.711 [2024-07-15 12:11:21.283244] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14a9f40 00:30:07.711 [2024-07-15 12:11:21.283355] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14a5180 00:30:07.711 [2024-07-15 12:11:21.283364] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14a5180 00:30:07.711 [2024-07-15 12:11:21.283437] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:07.969 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:07.969 "name": "raid_bdev1", 00:30:07.969 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:07.969 "strip_size_kb": 0, 00:30:07.969 "state": "online", 00:30:07.969 "raid_level": "raid1", 00:30:07.969 "superblock": true, 00:30:07.969 "num_base_bdevs": 2, 00:30:07.969 
"num_base_bdevs_discovered": 2, 00:30:07.970 "num_base_bdevs_operational": 2, 00:30:07.970 "base_bdevs_list": [ 00:30:07.970 { 00:30:07.970 "name": "spare", 00:30:07.970 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:07.970 "is_configured": true, 00:30:07.970 "data_offset": 256, 00:30:07.970 "data_size": 7936 00:30:07.970 }, 00:30:07.970 { 00:30:07.970 "name": "BaseBdev2", 00:30:07.970 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:07.970 "is_configured": true, 00:30:07.970 "data_offset": 256, 00:30:07.970 "data_size": 7936 00:30:07.970 } 00:30:07.970 ] 00:30:07.970 }' 00:30:07.970 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:07.970 12:11:21 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.537 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:08.795 "name": "raid_bdev1", 00:30:08.795 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:08.795 
"strip_size_kb": 0, 00:30:08.795 "state": "online", 00:30:08.795 "raid_level": "raid1", 00:30:08.795 "superblock": true, 00:30:08.795 "num_base_bdevs": 2, 00:30:08.795 "num_base_bdevs_discovered": 2, 00:30:08.795 "num_base_bdevs_operational": 2, 00:30:08.795 "base_bdevs_list": [ 00:30:08.795 { 00:30:08.795 "name": "spare", 00:30:08.795 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:08.795 "is_configured": true, 00:30:08.795 "data_offset": 256, 00:30:08.795 "data_size": 7936 00:30:08.795 }, 00:30:08.795 { 00:30:08.795 "name": "BaseBdev2", 00:30:08.795 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:08.795 "is_configured": true, 00:30:08.795 "data_offset": 256, 00:30:08.795 "data_size": 7936 00:30:08.795 } 00:30:08.795 ] 00:30:08.795 }' 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:08.795 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:30:09.054 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:30:09.054 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:30:09.313 [2024-07-15 12:11:22.801533] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:09.313 12:11:22 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:09.573 12:11:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:09.573 "name": "raid_bdev1", 00:30:09.573 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:09.573 "strip_size_kb": 0, 00:30:09.573 "state": "online", 00:30:09.573 "raid_level": "raid1", 00:30:09.573 "superblock": true, 00:30:09.573 
"num_base_bdevs": 2, 00:30:09.573 "num_base_bdevs_discovered": 1, 00:30:09.573 "num_base_bdevs_operational": 1, 00:30:09.573 "base_bdevs_list": [ 00:30:09.573 { 00:30:09.573 "name": null, 00:30:09.573 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:09.573 "is_configured": false, 00:30:09.573 "data_offset": 256, 00:30:09.573 "data_size": 7936 00:30:09.573 }, 00:30:09.573 { 00:30:09.573 "name": "BaseBdev2", 00:30:09.573 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:09.573 "is_configured": true, 00:30:09.573 "data_offset": 256, 00:30:09.573 "data_size": 7936 00:30:09.573 } 00:30:09.573 ] 00:30:09.573 }' 00:30:09.573 12:11:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:09.573 12:11:23 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:10.151 12:11:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:10.410 [2024-07-15 12:11:23.924524] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:10.410 [2024-07-15 12:11:23.924672] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:30:10.410 [2024-07-15 12:11:23.924694] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:30:10.410 [2024-07-15 12:11:23.924722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:10.410 [2024-07-15 12:11:23.926889] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x130e360 00:30:10.410 [2024-07-15 12:11:23.929202] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:10.410 12:11:23 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:11.789 12:11:24 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:11.789 "name": "raid_bdev1", 00:30:11.789 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:11.789 "strip_size_kb": 0, 00:30:11.789 "state": "online", 00:30:11.789 "raid_level": "raid1", 00:30:11.789 "superblock": true, 00:30:11.789 "num_base_bdevs": 2, 00:30:11.789 "num_base_bdevs_discovered": 2, 00:30:11.789 "num_base_bdevs_operational": 2, 00:30:11.789 "process": { 00:30:11.789 "type": "rebuild", 00:30:11.789 
"target": "spare", 00:30:11.789 "progress": { 00:30:11.789 "blocks": 3072, 00:30:11.789 "percent": 38 00:30:11.789 } 00:30:11.789 }, 00:30:11.789 "base_bdevs_list": [ 00:30:11.789 { 00:30:11.789 "name": "spare", 00:30:11.789 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:11.789 "is_configured": true, 00:30:11.789 "data_offset": 256, 00:30:11.789 "data_size": 7936 00:30:11.789 }, 00:30:11.789 { 00:30:11.789 "name": "BaseBdev2", 00:30:11.789 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:11.789 "is_configured": true, 00:30:11.789 "data_offset": 256, 00:30:11.789 "data_size": 7936 00:30:11.789 } 00:30:11.789 ] 00:30:11.789 }' 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:11.789 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:12.049 [2024-07-15 12:11:25.518660] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:12.049 [2024-07-15 12:11:25.541950] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:12.049 [2024-07-15 12:11:25.541992] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:12.049 [2024-07-15 12:11:25.542006] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:12.049 [2024-07-15 12:11:25.542014] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:12.049 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:12.309 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:12.309 "name": "raid_bdev1", 00:30:12.309 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:12.309 "strip_size_kb": 0, 00:30:12.309 "state": "online", 00:30:12.309 "raid_level": "raid1", 00:30:12.309 "superblock": true, 00:30:12.309 "num_base_bdevs": 2, 00:30:12.309 "num_base_bdevs_discovered": 1, 
00:30:12.309 "num_base_bdevs_operational": 1, 00:30:12.309 "base_bdevs_list": [ 00:30:12.309 { 00:30:12.309 "name": null, 00:30:12.309 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:12.309 "is_configured": false, 00:30:12.309 "data_offset": 256, 00:30:12.309 "data_size": 7936 00:30:12.309 }, 00:30:12.309 { 00:30:12.309 "name": "BaseBdev2", 00:30:12.309 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:12.309 "is_configured": true, 00:30:12.309 "data_offset": 256, 00:30:12.309 "data_size": 7936 00:30:12.309 } 00:30:12.309 ] 00:30:12.309 }' 00:30:12.309 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:12.309 12:11:25 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:12.877 12:11:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:13.137 [2024-07-15 12:11:26.647904] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:13.137 [2024-07-15 12:11:26.647953] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:13.137 [2024-07-15 12:11:26.647976] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1494a40 00:30:13.137 [2024-07-15 12:11:26.647990] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:13.137 [2024-07-15 12:11:26.648206] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:13.137 [2024-07-15 12:11:26.648222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:13.137 [2024-07-15 12:11:26.648280] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:30:13.137 [2024-07-15 12:11:26.648291] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:30:13.137 [2024-07-15 12:11:26.648302] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:30:13.137 [2024-07-15 12:11:26.648325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:13.137 [2024-07-15 12:11:26.650502] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14948b0 00:30:13.137 [2024-07-15 12:11:26.651950] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:13.137 spare 00:30:13.137 12:11:26 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:14.516 12:11:27 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:14.776 "name": "raid_bdev1", 00:30:14.776 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:14.776 "strip_size_kb": 0, 00:30:14.776 "state": "online", 00:30:14.776 "raid_level": "raid1", 00:30:14.776 "superblock": 
true, 00:30:14.776 "num_base_bdevs": 2, 00:30:14.776 "num_base_bdevs_discovered": 2, 00:30:14.776 "num_base_bdevs_operational": 2, 00:30:14.776 "process": { 00:30:14.776 "type": "rebuild", 00:30:14.776 "target": "spare", 00:30:14.776 "progress": { 00:30:14.776 "blocks": 3840, 00:30:14.776 "percent": 48 00:30:14.776 } 00:30:14.776 }, 00:30:14.776 "base_bdevs_list": [ 00:30:14.776 { 00:30:14.776 "name": "spare", 00:30:14.776 "uuid": "b5b5afe6-3668-57a4-a617-5d25edd2fed2", 00:30:14.776 "is_configured": true, 00:30:14.776 "data_offset": 256, 00:30:14.776 "data_size": 7936 00:30:14.776 }, 00:30:14.776 { 00:30:14.776 "name": "BaseBdev2", 00:30:14.776 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:14.776 "is_configured": true, 00:30:14.776 "data_offset": 256, 00:30:14.776 "data_size": 7936 00:30:14.776 } 00:30:14.776 ] 00:30:14.776 }' 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:30:14.776 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:30:15.036 [2024-07-15 12:11:28.518083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:15.036 [2024-07-15 12:11:28.567197] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:30:15.036 [2024-07-15 12:11:28.567238] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:15.036 [2024-07-15 12:11:28.567252] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:30:15.036 [2024-07-15 12:11:28.567261] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:15.036 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:15.295 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:15.295 "name": "raid_bdev1", 00:30:15.295 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 
00:30:15.295 "strip_size_kb": 0, 00:30:15.295 "state": "online", 00:30:15.295 "raid_level": "raid1", 00:30:15.295 "superblock": true, 00:30:15.295 "num_base_bdevs": 2, 00:30:15.295 "num_base_bdevs_discovered": 1, 00:30:15.295 "num_base_bdevs_operational": 1, 00:30:15.295 "base_bdevs_list": [ 00:30:15.295 { 00:30:15.295 "name": null, 00:30:15.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:15.295 "is_configured": false, 00:30:15.295 "data_offset": 256, 00:30:15.295 "data_size": 7936 00:30:15.295 }, 00:30:15.295 { 00:30:15.295 "name": "BaseBdev2", 00:30:15.295 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:15.295 "is_configured": true, 00:30:15.295 "data_offset": 256, 00:30:15.295 "data_size": 7936 00:30:15.295 } 00:30:15.295 ] 00:30:15.295 }' 00:30:15.295 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:15.295 12:11:28 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:16.234 12:11:29 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:16.234 "name": "raid_bdev1", 00:30:16.234 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:16.234 "strip_size_kb": 0, 00:30:16.234 "state": "online", 00:30:16.234 "raid_level": "raid1", 00:30:16.234 "superblock": true, 00:30:16.234 "num_base_bdevs": 2, 00:30:16.234 "num_base_bdevs_discovered": 1, 00:30:16.234 "num_base_bdevs_operational": 1, 00:30:16.234 "base_bdevs_list": [ 00:30:16.234 { 00:30:16.234 "name": null, 00:30:16.234 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:16.234 "is_configured": false, 00:30:16.234 "data_offset": 256, 00:30:16.234 "data_size": 7936 00:30:16.234 }, 00:30:16.234 { 00:30:16.234 "name": "BaseBdev2", 00:30:16.234 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:16.234 "is_configured": true, 00:30:16.234 "data_offset": 256, 00:30:16.234 "data_size": 7936 00:30:16.234 } 00:30:16.234 ] 00:30:16.234 }' 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:16.234 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:16.492 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:16.492 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:16.492 12:11:29 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:30:16.750 12:11:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:17.009 [2024-07-15 12:11:30.399653] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:30:17.009 [2024-07-15 12:11:30.399712] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:17.009 [2024-07-15 12:11:30.399735] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14a7ef0 00:30:17.009 [2024-07-15 12:11:30.399748] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:17.009 [2024-07-15 12:11:30.399940] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:17.009 [2024-07-15 12:11:30.399956] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:17.009 [2024-07-15 12:11:30.400001] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:30:17.009 [2024-07-15 12:11:30.400012] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:17.009 [2024-07-15 12:11:30.400023] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:17.009 BaseBdev1 00:30:17.009 12:11:30 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:17.946 
12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:17.946 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:18.205 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:18.205 "name": "raid_bdev1", 00:30:18.205 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:18.205 "strip_size_kb": 0, 00:30:18.205 "state": "online", 00:30:18.205 "raid_level": "raid1", 00:30:18.205 "superblock": true, 00:30:18.205 "num_base_bdevs": 2, 00:30:18.205 "num_base_bdevs_discovered": 1, 00:30:18.205 "num_base_bdevs_operational": 1, 00:30:18.205 "base_bdevs_list": [ 00:30:18.205 { 00:30:18.205 "name": null, 00:30:18.205 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:18.205 "is_configured": false, 00:30:18.205 "data_offset": 256, 00:30:18.205 "data_size": 7936 00:30:18.205 }, 00:30:18.205 { 00:30:18.205 "name": "BaseBdev2", 00:30:18.205 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:18.205 "is_configured": true, 00:30:18.205 "data_offset": 256, 00:30:18.205 "data_size": 7936 00:30:18.205 } 00:30:18.205 ] 00:30:18.205 }' 00:30:18.205 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:18.205 12:11:31 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:18.772 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:19.031 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:19.031 "name": "raid_bdev1", 00:30:19.031 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:19.031 "strip_size_kb": 0, 00:30:19.031 "state": "online", 00:30:19.031 "raid_level": "raid1", 00:30:19.031 "superblock": true, 00:30:19.031 "num_base_bdevs": 2, 00:30:19.032 "num_base_bdevs_discovered": 1, 00:30:19.032 "num_base_bdevs_operational": 1, 00:30:19.032 "base_bdevs_list": [ 00:30:19.032 { 00:30:19.032 "name": null, 00:30:19.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:19.032 "is_configured": false, 00:30:19.032 "data_offset": 256, 00:30:19.032 "data_size": 7936 00:30:19.032 }, 00:30:19.032 { 00:30:19.032 "name": "BaseBdev2", 00:30:19.032 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:19.032 "is_configured": true, 00:30:19.032 "data_offset": 256, 00:30:19.032 "data_size": 7936 00:30:19.032 } 00:30:19.032 ] 00:30:19.032 }' 00:30:19.032 12:11:32 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:19.032 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:19.032 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@648 -- # local es=0 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:30:19.291 [2024-07-15 12:11:32.866421] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:19.291 [2024-07-15 12:11:32.866541] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:30:19.291 [2024-07-15 12:11:32.866556] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:30:19.291 request: 00:30:19.291 { 00:30:19.291 "base_bdev": "BaseBdev1", 00:30:19.291 "raid_bdev": "raid_bdev1", 00:30:19.291 "method": "bdev_raid_add_base_bdev", 00:30:19.291 "req_id": 1 00:30:19.291 } 00:30:19.291 Got JSON-RPC error response 00:30:19.291 response: 00:30:19.291 { 00:30:19.291 "code": -22, 00:30:19.291 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:30:19.291 } 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # es=1 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:19.291 12:11:32 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:20.706 12:11:33 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:20.706 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:20.706 "name": "raid_bdev1", 00:30:20.706 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:20.706 "strip_size_kb": 0, 00:30:20.706 "state": "online", 00:30:20.706 "raid_level": "raid1", 00:30:20.706 "superblock": true, 00:30:20.706 "num_base_bdevs": 2, 00:30:20.706 "num_base_bdevs_discovered": 1, 
00:30:20.706 "num_base_bdevs_operational": 1, 00:30:20.706 "base_bdevs_list": [ 00:30:20.706 { 00:30:20.706 "name": null, 00:30:20.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:20.706 "is_configured": false, 00:30:20.706 "data_offset": 256, 00:30:20.706 "data_size": 7936 00:30:20.706 }, 00:30:20.706 { 00:30:20.706 "name": "BaseBdev2", 00:30:20.706 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:20.706 "is_configured": true, 00:30:20.706 "data_offset": 256, 00:30:20.706 "data_size": 7936 00:30:20.706 } 00:30:20.706 ] 00:30:20.706 }' 00:30:20.706 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:20.706 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:21.275 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:21.535 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:21.535 "name": "raid_bdev1", 00:30:21.535 "uuid": "3f4a5c0f-7a03-4673-a25a-0008fce3c835", 00:30:21.535 "strip_size_kb": 0, 00:30:21.535 
"state": "online", 00:30:21.535 "raid_level": "raid1", 00:30:21.535 "superblock": true, 00:30:21.535 "num_base_bdevs": 2, 00:30:21.535 "num_base_bdevs_discovered": 1, 00:30:21.535 "num_base_bdevs_operational": 1, 00:30:21.535 "base_bdevs_list": [ 00:30:21.535 { 00:30:21.535 "name": null, 00:30:21.535 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:21.535 "is_configured": false, 00:30:21.535 "data_offset": 256, 00:30:21.535 "data_size": 7936 00:30:21.535 }, 00:30:21.535 { 00:30:21.535 "name": "BaseBdev2", 00:30:21.535 "uuid": "acbeabb6-22ec-566a-a35b-e62805b16d6c", 00:30:21.535 "is_configured": true, 00:30:21.535 "data_offset": 256, 00:30:21.535 "data_size": 7936 00:30:21.535 } 00:30:21.535 ] 00:30:21.535 }' 00:30:21.535 12:11:34 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 1609881 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@948 -- # '[' -z 1609881 ']' 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@952 -- # kill -0 1609881 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # uname 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1609881 00:30:21.535 12:11:35 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1609881' 00:30:21.535 killing process with pid 1609881 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # kill 1609881 00:30:21.535 Received shutdown signal, test time was about 60.000000 seconds 00:30:21.535 00:30:21.535 Latency(us) 00:30:21.535 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:21.535 =================================================================================================================== 00:30:21.535 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:30:21.535 [2024-07-15 12:11:35.112928] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:21.535 [2024-07-15 12:11:35.113022] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:21.535 [2024-07-15 12:11:35.113071] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:21.535 [2024-07-15 12:11:35.113083] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14a5180 name raid_bdev1, state offline 00:30:21.535 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@972 -- # wait 1609881 00:30:21.795 [2024-07-15 12:11:35.146741] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:21.795 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:30:21.795 00:30:21.795 real 0m33.538s 00:30:21.795 user 0m53.324s 00:30:21.795 sys 0m5.448s 00:30:21.795 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # xtrace_disable 
00:30:21.795 12:11:35 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:30:21.795 ************************************ 00:30:21.795 END TEST raid_rebuild_test_sb_md_separate 00:30:21.796 ************************************ 00:30:22.056 12:11:35 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:22.056 12:11:35 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:30:22.056 12:11:35 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:30:22.056 12:11:35 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:30:22.056 12:11:35 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:22.056 12:11:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:22.056 ************************************ 00:30:22.056 START TEST raid_state_function_test_sb_md_interleaved 00:30:22.056 ************************************ 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_state_function_test raid1 2 true 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:30:22.056 12:11:35 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # raid_pid=1614558 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1614558' 00:30:22.056 Process raid pid: 1614558 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1614558 /var/tmp/spdk-raid.sock 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1614558 ']' 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:22.056 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:22.057 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:22.057 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:22.057 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:22.057 12:11:35 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:22.057 [2024-07-15 12:11:35.526768] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
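The `waitforlisten 1614558 /var/tmp/spdk-raid.sock` step above blocks until the freshly launched `bdev_svc` process is up and its JSON-RPC UNIX socket exists. A rough standalone sketch of that poll-with-retries pattern (the socket is simulated with an ordinary file here, and `max_retries` mirrors the `local max_retries=100` visible in the log):

```shell
# Simulate the RPC socket appearing after a short startup delay.
sock="${TMPDIR:-/tmp}/spdk-raid.sock.demo"
rm -f "$sock"
( sleep 1; touch "$sock" ) &

# Poll until the path exists or the retry budget is exhausted.
max_retries=100
i=0
while [ ! -e "$sock" ]; do
    i=$((i + 1))
    [ "$i" -ge "$max_retries" ] && { echo "timed out waiting for $sock" >&2; exit 1; }
    sleep 0.1
done
echo "listening: $sock"
rm -f "$sock"
```

The real helper additionally verifies with `kill -0` that the target pid is still alive on every iteration, so a crashed target fails the wait immediately instead of burning the full timeout.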
00:30:22.057 [2024-07-15 12:11:35.526833] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:22.057 [2024-07-15 12:11:35.643551] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:22.317 [2024-07-15 12:11:35.740982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:22.317 [2024-07-15 12:11:35.802063] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:22.317 [2024-07-15 12:11:35.802098] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:22.886 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:22.886 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:22.886 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:23.146 [2024-07-15 12:11:36.678523] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:23.146 [2024-07-15 12:11:36.678604] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:23.146 [2024-07-15 12:11:36.678628] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:23.146 [2024-07-15 12:11:36.678654] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:23.146 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:23.147 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:23.407 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:23.407 "name": "Existed_Raid", 00:30:23.407 "uuid": "09d9c8db-8838-4985-8d49-175e8b4ddd0e", 00:30:23.407 "strip_size_kb": 0, 00:30:23.407 "state": "configuring", 00:30:23.407 "raid_level": "raid1", 00:30:23.407 "superblock": true, 00:30:23.407 "num_base_bdevs": 2, 00:30:23.407 "num_base_bdevs_discovered": 0, 00:30:23.407 "num_base_bdevs_operational": 2, 00:30:23.407 "base_bdevs_list": [ 00:30:23.407 { 
00:30:23.407 "name": "BaseBdev1", 00:30:23.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:23.407 "is_configured": false, 00:30:23.407 "data_offset": 0, 00:30:23.407 "data_size": 0 00:30:23.407 }, 00:30:23.407 { 00:30:23.407 "name": "BaseBdev2", 00:30:23.407 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:23.407 "is_configured": false, 00:30:23.407 "data_offset": 0, 00:30:23.407 "data_size": 0 00:30:23.407 } 00:30:23.407 ] 00:30:23.407 }' 00:30:23.407 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:23.407 12:11:36 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:24.346 12:11:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:24.346 [2024-07-15 12:11:37.841556] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:24.346 [2024-07-15 12:11:37.841608] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116cb00 name Existed_Raid, state configuring 00:30:24.346 12:11:37 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:24.606 [2024-07-15 12:11:38.086333] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:30:24.606 [2024-07-15 12:11:38.086388] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:30:24.606 [2024-07-15 12:11:38.086409] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:24.606 [2024-07-15 12:11:38.086435] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:24.606 
12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:30:24.866 [2024-07-15 12:11:38.342160] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:24.866 BaseBdev1 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev1 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:24.866 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:25.126 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:30:25.386 [ 00:30:25.386 { 00:30:25.386 "name": "BaseBdev1", 00:30:25.386 "aliases": [ 00:30:25.386 "700c5f32-0fdb-4b96-b37a-8a44f56d49e3" 00:30:25.386 ], 00:30:25.386 "product_name": "Malloc disk", 00:30:25.386 "block_size": 4128, 00:30:25.386 "num_blocks": 8192, 00:30:25.386 "uuid": "700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:25.386 "md_size": 32, 00:30:25.386 
"md_interleave": true, 00:30:25.386 "dif_type": 0, 00:30:25.386 "assigned_rate_limits": { 00:30:25.386 "rw_ios_per_sec": 0, 00:30:25.386 "rw_mbytes_per_sec": 0, 00:30:25.386 "r_mbytes_per_sec": 0, 00:30:25.386 "w_mbytes_per_sec": 0 00:30:25.386 }, 00:30:25.386 "claimed": true, 00:30:25.386 "claim_type": "exclusive_write", 00:30:25.386 "zoned": false, 00:30:25.386 "supported_io_types": { 00:30:25.386 "read": true, 00:30:25.386 "write": true, 00:30:25.386 "unmap": true, 00:30:25.386 "flush": true, 00:30:25.386 "reset": true, 00:30:25.386 "nvme_admin": false, 00:30:25.386 "nvme_io": false, 00:30:25.387 "nvme_io_md": false, 00:30:25.387 "write_zeroes": true, 00:30:25.387 "zcopy": true, 00:30:25.387 "get_zone_info": false, 00:30:25.387 "zone_management": false, 00:30:25.387 "zone_append": false, 00:30:25.387 "compare": false, 00:30:25.387 "compare_and_write": false, 00:30:25.387 "abort": true, 00:30:25.387 "seek_hole": false, 00:30:25.387 "seek_data": false, 00:30:25.387 "copy": true, 00:30:25.387 "nvme_iov_md": false 00:30:25.387 }, 00:30:25.387 "memory_domains": [ 00:30:25.387 { 00:30:25.387 "dma_device_id": "system", 00:30:25.387 "dma_device_type": 1 00:30:25.387 }, 00:30:25.387 { 00:30:25.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:25.387 "dma_device_type": 2 00:30:25.387 } 00:30:25.387 ], 00:30:25.387 "driver_specific": {} 00:30:25.387 } 00:30:25.387 ] 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:25.387 12:11:38 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:25.387 12:11:38 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:25.647 12:11:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:25.647 "name": "Existed_Raid", 00:30:25.647 "uuid": "81fc97ef-af66-49bc-889d-40543aca6dd6", 00:30:25.647 "strip_size_kb": 0, 00:30:25.647 "state": "configuring", 00:30:25.647 "raid_level": "raid1", 00:30:25.647 "superblock": true, 00:30:25.647 "num_base_bdevs": 2, 00:30:25.647 "num_base_bdevs_discovered": 1, 00:30:25.647 "num_base_bdevs_operational": 2, 00:30:25.647 "base_bdevs_list": [ 00:30:25.647 { 00:30:25.647 "name": "BaseBdev1", 00:30:25.647 "uuid": "700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:25.647 "is_configured": true, 00:30:25.647 "data_offset": 256, 00:30:25.647 "data_size": 7936 00:30:25.647 }, 
00:30:25.647 { 00:30:25.647 "name": "BaseBdev2", 00:30:25.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:25.647 "is_configured": false, 00:30:25.647 "data_offset": 0, 00:30:25.647 "data_size": 0 00:30:25.647 } 00:30:25.647 ] 00:30:25.647 }' 00:30:25.647 12:11:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:25.647 12:11:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:26.587 12:11:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:30:26.587 [2024-07-15 12:11:39.958956] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:30:26.587 [2024-07-15 12:11:39.959024] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116c3d0 name Existed_Raid, state configuring 00:30:26.587 12:11:39 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:30:26.587 [2024-07-15 12:11:40.155629] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:26.587 [2024-07-15 12:11:40.158818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:30:26.587 [2024-07-15 12:11:40.158885] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:26.846 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:27.104 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:27.104 "name": "Existed_Raid", 00:30:27.104 "uuid": "43a9b142-fbb2-4c79-abe8-82273c85249c", 00:30:27.104 "strip_size_kb": 0, 00:30:27.104 "state": "configuring", 00:30:27.104 "raid_level": "raid1", 00:30:27.105 "superblock": true, 00:30:27.105 "num_base_bdevs": 2, 
00:30:27.105 "num_base_bdevs_discovered": 1, 00:30:27.105 "num_base_bdevs_operational": 2, 00:30:27.105 "base_bdevs_list": [ 00:30:27.105 { 00:30:27.105 "name": "BaseBdev1", 00:30:27.105 "uuid": "700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:27.105 "is_configured": true, 00:30:27.105 "data_offset": 256, 00:30:27.105 "data_size": 7936 00:30:27.105 }, 00:30:27.105 { 00:30:27.105 "name": "BaseBdev2", 00:30:27.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:27.105 "is_configured": false, 00:30:27.105 "data_offset": 0, 00:30:27.105 "data_size": 0 00:30:27.105 } 00:30:27.105 ] 00:30:27.105 }' 00:30:27.105 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:27.105 12:11:40 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:30:27.671 [2024-07-15 12:11:41.203028] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:27.671 [2024-07-15 12:11:41.203301] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x116bb60 00:30:27.671 [2024-07-15 12:11:41.203330] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:27.671 [2024-07-15 12:11:41.203451] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x116e2e0 00:30:27.671 [2024-07-15 12:11:41.203601] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x116bb60 00:30:27.671 [2024-07-15 12:11:41.203623] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x116bb60 00:30:27.671 [2024-07-15 12:11:41.203778] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:27.671 BaseBdev2 
00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@897 -- # local bdev_name=BaseBdev2 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local i 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:30:27.671 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:30:27.937 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:30:28.200 [ 00:30:28.200 { 00:30:28.200 "name": "BaseBdev2", 00:30:28.200 "aliases": [ 00:30:28.200 "4dc12a9b-93fa-413b-8629-0905b3911991" 00:30:28.200 ], 00:30:28.200 "product_name": "Malloc disk", 00:30:28.200 "block_size": 4128, 00:30:28.200 "num_blocks": 8192, 00:30:28.200 "uuid": "4dc12a9b-93fa-413b-8629-0905b3911991", 00:30:28.200 "md_size": 32, 00:30:28.200 "md_interleave": true, 00:30:28.200 "dif_type": 0, 00:30:28.200 "assigned_rate_limits": { 00:30:28.200 "rw_ios_per_sec": 0, 00:30:28.200 "rw_mbytes_per_sec": 0, 00:30:28.200 "r_mbytes_per_sec": 0, 00:30:28.200 "w_mbytes_per_sec": 0 00:30:28.200 }, 00:30:28.200 "claimed": true, 00:30:28.200 "claim_type": "exclusive_write", 00:30:28.200 "zoned": false, 00:30:28.200 "supported_io_types": { 
00:30:28.200 "read": true, 00:30:28.200 "write": true, 00:30:28.200 "unmap": true, 00:30:28.200 "flush": true, 00:30:28.200 "reset": true, 00:30:28.200 "nvme_admin": false, 00:30:28.200 "nvme_io": false, 00:30:28.200 "nvme_io_md": false, 00:30:28.200 "write_zeroes": true, 00:30:28.200 "zcopy": true, 00:30:28.200 "get_zone_info": false, 00:30:28.200 "zone_management": false, 00:30:28.200 "zone_append": false, 00:30:28.200 "compare": false, 00:30:28.200 "compare_and_write": false, 00:30:28.200 "abort": true, 00:30:28.200 "seek_hole": false, 00:30:28.200 "seek_data": false, 00:30:28.200 "copy": true, 00:30:28.200 "nvme_iov_md": false 00:30:28.200 }, 00:30:28.200 "memory_domains": [ 00:30:28.200 { 00:30:28.200 "dma_device_id": "system", 00:30:28.200 "dma_device_type": 1 00:30:28.200 }, 00:30:28.200 { 00:30:28.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:28.200 "dma_device_type": 2 00:30:28.200 } 00:30:28.200 ], 00:30:28.200 "driver_specific": {} 00:30:28.200 } 00:30:28.200 ] 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # return 0 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:28.200 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:28.201 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:28.201 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:28.201 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:28.201 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:28.201 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:28.458 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:28.458 "name": "Existed_Raid", 00:30:28.458 "uuid": "43a9b142-fbb2-4c79-abe8-82273c85249c", 00:30:28.458 "strip_size_kb": 0, 00:30:28.458 "state": "online", 00:30:28.458 "raid_level": "raid1", 00:30:28.458 "superblock": true, 00:30:28.458 "num_base_bdevs": 2, 00:30:28.458 "num_base_bdevs_discovered": 2, 00:30:28.458 "num_base_bdevs_operational": 2, 00:30:28.458 "base_bdevs_list": [ 00:30:28.458 { 00:30:28.458 "name": "BaseBdev1", 00:30:28.458 "uuid": "700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:28.458 "is_configured": true, 00:30:28.458 "data_offset": 256, 00:30:28.458 "data_size": 7936 00:30:28.458 }, 00:30:28.458 { 00:30:28.458 "name": "BaseBdev2", 00:30:28.458 "uuid": "4dc12a9b-93fa-413b-8629-0905b3911991", 00:30:28.458 "is_configured": true, 00:30:28.458 "data_offset": 256, 00:30:28.458 
"data_size": 7936 00:30:28.458 } 00:30:28.458 ] 00:30:28.458 }' 00:30:28.458 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:28.458 12:11:41 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:30:29.025 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:29.284 [2024-07-15 12:11:42.647950] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:29.284 "name": "Existed_Raid", 00:30:29.284 "aliases": [ 00:30:29.284 "43a9b142-fbb2-4c79-abe8-82273c85249c" 00:30:29.284 ], 00:30:29.284 "product_name": "Raid Volume", 00:30:29.284 "block_size": 4128, 00:30:29.284 "num_blocks": 7936, 00:30:29.284 "uuid": "43a9b142-fbb2-4c79-abe8-82273c85249c", 00:30:29.284 "md_size": 32, 
00:30:29.284 "md_interleave": true, 00:30:29.284 "dif_type": 0, 00:30:29.284 "assigned_rate_limits": { 00:30:29.284 "rw_ios_per_sec": 0, 00:30:29.284 "rw_mbytes_per_sec": 0, 00:30:29.284 "r_mbytes_per_sec": 0, 00:30:29.284 "w_mbytes_per_sec": 0 00:30:29.284 }, 00:30:29.284 "claimed": false, 00:30:29.284 "zoned": false, 00:30:29.284 "supported_io_types": { 00:30:29.284 "read": true, 00:30:29.284 "write": true, 00:30:29.284 "unmap": false, 00:30:29.284 "flush": false, 00:30:29.284 "reset": true, 00:30:29.284 "nvme_admin": false, 00:30:29.284 "nvme_io": false, 00:30:29.284 "nvme_io_md": false, 00:30:29.284 "write_zeroes": true, 00:30:29.284 "zcopy": false, 00:30:29.284 "get_zone_info": false, 00:30:29.284 "zone_management": false, 00:30:29.284 "zone_append": false, 00:30:29.284 "compare": false, 00:30:29.284 "compare_and_write": false, 00:30:29.284 "abort": false, 00:30:29.284 "seek_hole": false, 00:30:29.284 "seek_data": false, 00:30:29.284 "copy": false, 00:30:29.284 "nvme_iov_md": false 00:30:29.284 }, 00:30:29.284 "memory_domains": [ 00:30:29.284 { 00:30:29.284 "dma_device_id": "system", 00:30:29.284 "dma_device_type": 1 00:30:29.284 }, 00:30:29.284 { 00:30:29.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.284 "dma_device_type": 2 00:30:29.284 }, 00:30:29.284 { 00:30:29.284 "dma_device_id": "system", 00:30:29.284 "dma_device_type": 1 00:30:29.284 }, 00:30:29.284 { 00:30:29.284 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.284 "dma_device_type": 2 00:30:29.284 } 00:30:29.284 ], 00:30:29.284 "driver_specific": { 00:30:29.284 "raid": { 00:30:29.284 "uuid": "43a9b142-fbb2-4c79-abe8-82273c85249c", 00:30:29.284 "strip_size_kb": 0, 00:30:29.284 "state": "online", 00:30:29.284 "raid_level": "raid1", 00:30:29.284 "superblock": true, 00:30:29.284 "num_base_bdevs": 2, 00:30:29.284 "num_base_bdevs_discovered": 2, 00:30:29.284 "num_base_bdevs_operational": 2, 00:30:29.284 "base_bdevs_list": [ 00:30:29.284 { 00:30:29.284 "name": "BaseBdev1", 00:30:29.284 "uuid": 
"700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:29.284 "is_configured": true, 00:30:29.284 "data_offset": 256, 00:30:29.284 "data_size": 7936 00:30:29.284 }, 00:30:29.284 { 00:30:29.284 "name": "BaseBdev2", 00:30:29.284 "uuid": "4dc12a9b-93fa-413b-8629-0905b3911991", 00:30:29.284 "is_configured": true, 00:30:29.284 "data_offset": 256, 00:30:29.284 "data_size": 7936 00:30:29.284 } 00:30:29.284 ] 00:30:29.284 } 00:30:29.284 } 00:30:29.284 }' 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:30:29.284 BaseBdev2' 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:30:29.284 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:29.543 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:29.543 "name": "BaseBdev1", 00:30:29.543 "aliases": [ 00:30:29.543 "700c5f32-0fdb-4b96-b37a-8a44f56d49e3" 00:30:29.543 ], 00:30:29.543 "product_name": "Malloc disk", 00:30:29.543 "block_size": 4128, 00:30:29.543 "num_blocks": 8192, 00:30:29.543 "uuid": "700c5f32-0fdb-4b96-b37a-8a44f56d49e3", 00:30:29.543 "md_size": 32, 00:30:29.543 "md_interleave": true, 00:30:29.543 "dif_type": 0, 00:30:29.543 "assigned_rate_limits": { 00:30:29.543 "rw_ios_per_sec": 0, 00:30:29.543 "rw_mbytes_per_sec": 0, 00:30:29.543 "r_mbytes_per_sec": 0, 00:30:29.543 "w_mbytes_per_sec": 0 00:30:29.543 }, 00:30:29.543 "claimed": 
true, 00:30:29.543 "claim_type": "exclusive_write", 00:30:29.543 "zoned": false, 00:30:29.543 "supported_io_types": { 00:30:29.543 "read": true, 00:30:29.543 "write": true, 00:30:29.543 "unmap": true, 00:30:29.543 "flush": true, 00:30:29.543 "reset": true, 00:30:29.543 "nvme_admin": false, 00:30:29.543 "nvme_io": false, 00:30:29.543 "nvme_io_md": false, 00:30:29.543 "write_zeroes": true, 00:30:29.543 "zcopy": true, 00:30:29.543 "get_zone_info": false, 00:30:29.543 "zone_management": false, 00:30:29.543 "zone_append": false, 00:30:29.543 "compare": false, 00:30:29.543 "compare_and_write": false, 00:30:29.543 "abort": true, 00:30:29.543 "seek_hole": false, 00:30:29.543 "seek_data": false, 00:30:29.543 "copy": true, 00:30:29.543 "nvme_iov_md": false 00:30:29.543 }, 00:30:29.543 "memory_domains": [ 00:30:29.543 { 00:30:29.543 "dma_device_id": "system", 00:30:29.543 "dma_device_type": 1 00:30:29.543 }, 00:30:29.543 { 00:30:29.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:29.543 "dma_device_type": 2 00:30:29.543 } 00:30:29.543 ], 00:30:29.543 "driver_specific": {} 00:30:29.543 }' 00:30:29.543 12:11:42 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:29.543 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:29.543 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:29.543 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:29.802 12:11:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:29.802 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.061 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.061 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:30.061 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:30:30.061 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:30.321 "name": "BaseBdev2", 00:30:30.321 "aliases": [ 00:30:30.321 "4dc12a9b-93fa-413b-8629-0905b3911991" 00:30:30.321 ], 00:30:30.321 "product_name": "Malloc disk", 00:30:30.321 "block_size": 4128, 00:30:30.321 "num_blocks": 8192, 00:30:30.321 "uuid": "4dc12a9b-93fa-413b-8629-0905b3911991", 00:30:30.321 "md_size": 32, 00:30:30.321 "md_interleave": true, 00:30:30.321 "dif_type": 0, 00:30:30.321 "assigned_rate_limits": { 00:30:30.321 "rw_ios_per_sec": 0, 00:30:30.321 "rw_mbytes_per_sec": 0, 00:30:30.321 "r_mbytes_per_sec": 0, 00:30:30.321 "w_mbytes_per_sec": 0 00:30:30.321 }, 00:30:30.321 "claimed": true, 00:30:30.321 "claim_type": "exclusive_write", 00:30:30.321 "zoned": false, 00:30:30.321 "supported_io_types": { 00:30:30.321 "read": true, 00:30:30.321 "write": true, 00:30:30.321 "unmap": true, 00:30:30.321 
"flush": true, 00:30:30.321 "reset": true, 00:30:30.321 "nvme_admin": false, 00:30:30.321 "nvme_io": false, 00:30:30.321 "nvme_io_md": false, 00:30:30.321 "write_zeroes": true, 00:30:30.321 "zcopy": true, 00:30:30.321 "get_zone_info": false, 00:30:30.321 "zone_management": false, 00:30:30.321 "zone_append": false, 00:30:30.321 "compare": false, 00:30:30.321 "compare_and_write": false, 00:30:30.321 "abort": true, 00:30:30.321 "seek_hole": false, 00:30:30.321 "seek_data": false, 00:30:30.321 "copy": true, 00:30:30.321 "nvme_iov_md": false 00:30:30.321 }, 00:30:30.321 "memory_domains": [ 00:30:30.321 { 00:30:30.321 "dma_device_id": "system", 00:30:30.321 "dma_device_type": 1 00:30:30.321 }, 00:30:30.321 { 00:30:30.321 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:30.321 "dma_device_type": 2 00:30:30.321 } 00:30:30.321 ], 00:30:30.321 "driver_specific": {} 00:30:30.321 }' 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.321 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:30.580 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:30.580 12:11:43 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.580 12:11:43 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:30.580 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:30.580 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:30:30.839 [2024-07-15 12:11:44.252154] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:30.839 12:11:44 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:30.839 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:30:31.098 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:31.098 "name": "Existed_Raid", 00:30:31.098 "uuid": "43a9b142-fbb2-4c79-abe8-82273c85249c", 00:30:31.098 "strip_size_kb": 0, 00:30:31.098 "state": "online", 00:30:31.098 "raid_level": "raid1", 00:30:31.098 "superblock": true, 00:30:31.098 "num_base_bdevs": 2, 00:30:31.098 "num_base_bdevs_discovered": 1, 00:30:31.098 "num_base_bdevs_operational": 1, 00:30:31.098 "base_bdevs_list": [ 00:30:31.098 { 00:30:31.098 "name": null, 00:30:31.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:31.098 "is_configured": false, 00:30:31.098 "data_offset": 256, 00:30:31.098 "data_size": 7936 00:30:31.098 }, 00:30:31.098 { 00:30:31.098 "name": "BaseBdev2", 00:30:31.098 "uuid": "4dc12a9b-93fa-413b-8629-0905b3911991", 00:30:31.098 "is_configured": true, 00:30:31.098 "data_offset": 256, 00:30:31.098 "data_size": 7936 00:30:31.098 } 00:30:31.098 ] 00:30:31.098 }' 00:30:31.098 
12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:31.098 12:11:44 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:32.034 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:30:32.034 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:32.034 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.034 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:30:32.292 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:30:32.292 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:30:32.293 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:30:32.293 [2024-07-15 12:11:45.877034] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:30:32.293 [2024-07-15 12:11:45.877189] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:32.552 [2024-07-15 12:11:45.893426] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:32.552 [2024-07-15 12:11:45.893487] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:32.552 [2024-07-15 12:11:45.893511] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x116bb60 name Existed_Raid, state offline 00:30:32.552 12:11:45 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:30:32.552 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:30:32.552 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:32.552 12:11:45 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1614558 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1614558 ']' 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1614558 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1614558 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 
00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1614558' 00:30:32.812 killing process with pid 1614558 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1614558 00:30:32.812 [2024-07-15 12:11:46.216254] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:32.812 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@972 -- # wait 1614558 00:30:32.812 [2024-07-15 12:11:46.217498] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:33.072 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:30:33.072 00:30:33.072 real 0m11.092s 00:30:33.072 user 0m19.548s 00:30:33.072 sys 0m2.096s 00:30:33.072 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:33.072 12:11:46 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:33.072 ************************************ 00:30:33.072 END TEST raid_state_function_test_sb_md_interleaved 00:30:33.072 ************************************ 00:30:33.072 12:11:46 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:33.072 12:11:46 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:30:33.072 12:11:46 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 4 -le 1 ']' 00:30:33.072 12:11:46 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:33.072 12:11:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:30:33.072 ************************************ 00:30:33.072 START TEST raid_superblock_test_md_interleaved 00:30:33.072 ************************************ 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@1123 -- # raid_superblock_test raid1 2 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=1616185 00:30:33.072 12:11:46 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 1616185 /var/tmp/spdk-raid.sock 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1616185 ']' 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:33.072 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:33.072 12:11:46 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:33.332 [2024-07-15 12:11:46.704919] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:30:33.332 [2024-07-15 12:11:46.704984] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1616185 ] 00:30:33.332 [2024-07-15 12:11:46.833385] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:33.591 [2024-07-15 12:11:46.938229] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:33.591 [2024-07-15 12:11:47.000521] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:33.591 [2024-07-15 12:11:47.000552] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:30:33.591 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:30:33.592 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:30:33.592 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:33.592 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:30:33.592 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:30:33.850 malloc1 00:30:33.851 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:34.110 [2024-07-15 12:11:47.647204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:34.110 [2024-07-15 12:11:47.647256] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:34.110 [2024-07-15 12:11:47.647285] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x208cb90 00:30:34.110 [2024-07-15 12:11:47.647298] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:34.110 [2024-07-15 12:11:47.648883] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:34.110 [2024-07-15 12:11:47.648911] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:34.110 pt1 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:30:34.110 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:30:34.369 malloc2 00:30:34.369 12:11:47 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:34.629 [2024-07-15 12:11:48.142808] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:34.629 [2024-07-15 12:11:48.142855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:34.629 [2024-07-15 12:11:48.142875] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207f420 00:30:34.629 [2024-07-15 12:11:48.142887] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:34.629 [2024-07-15 12:11:48.144386] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:34.629 [2024-07-15 12:11:48.144414] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:34.629 pt2 00:30:34.629 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:30:34.629 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:30:34.629 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:30:34.888 [2024-07-15 12:11:48.387465] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:34.888 [2024-07-15 12:11:48.388936] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:34.888 [2024-07-15 12:11:48.389092] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2072fa0 00:30:34.888 [2024-07-15 12:11:48.389106] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:34.888 [2024-07-15 12:11:48.389176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1eeff50 00:30:34.888 [2024-07-15 12:11:48.389256] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2072fa0 00:30:34.888 [2024-07-15 12:11:48.389267] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2072fa0 00:30:34.888 [2024-07-15 12:11:48.389324] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:34.888 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:35.147 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:35.147 "name": "raid_bdev1", 00:30:35.147 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:35.147 "strip_size_kb": 0, 00:30:35.147 "state": "online", 00:30:35.147 "raid_level": "raid1", 00:30:35.147 "superblock": true, 00:30:35.147 "num_base_bdevs": 2, 00:30:35.147 "num_base_bdevs_discovered": 2, 00:30:35.147 "num_base_bdevs_operational": 2, 00:30:35.147 "base_bdevs_list": [ 00:30:35.147 { 00:30:35.147 "name": "pt1", 00:30:35.147 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:35.147 "is_configured": true, 00:30:35.147 "data_offset": 256, 00:30:35.147 "data_size": 7936 00:30:35.147 }, 00:30:35.147 { 00:30:35.147 "name": "pt2", 00:30:35.147 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:35.147 "is_configured": true, 00:30:35.147 "data_offset": 256, 00:30:35.147 "data_size": 7936 00:30:35.147 } 00:30:35.147 ] 00:30:35.147 }' 00:30:35.147 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:35.148 12:11:48 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:30:35.715 12:11:49 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:35.715 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:35.974 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:36.234 [2024-07-15 12:11:49.582872] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:36.234 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:36.234 "name": "raid_bdev1", 00:30:36.234 "aliases": [ 00:30:36.234 "d9207563-b757-4b0d-9cc6-ee13a0b82b7d" 00:30:36.234 ], 00:30:36.234 "product_name": "Raid Volume", 00:30:36.234 "block_size": 4128, 00:30:36.234 "num_blocks": 7936, 00:30:36.234 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:36.234 "md_size": 32, 00:30:36.234 "md_interleave": true, 00:30:36.234 "dif_type": 0, 00:30:36.234 "assigned_rate_limits": { 00:30:36.234 "rw_ios_per_sec": 0, 00:30:36.234 "rw_mbytes_per_sec": 0, 00:30:36.234 "r_mbytes_per_sec": 0, 00:30:36.234 "w_mbytes_per_sec": 0 00:30:36.234 }, 00:30:36.234 "claimed": false, 00:30:36.234 "zoned": false, 00:30:36.234 "supported_io_types": { 00:30:36.234 "read": true, 00:30:36.234 "write": true, 00:30:36.234 "unmap": false, 00:30:36.234 "flush": false, 00:30:36.234 "reset": true, 00:30:36.234 "nvme_admin": false, 
00:30:36.234 "nvme_io": false, 00:30:36.234 "nvme_io_md": false, 00:30:36.234 "write_zeroes": true, 00:30:36.234 "zcopy": false, 00:30:36.234 "get_zone_info": false, 00:30:36.234 "zone_management": false, 00:30:36.234 "zone_append": false, 00:30:36.234 "compare": false, 00:30:36.234 "compare_and_write": false, 00:30:36.234 "abort": false, 00:30:36.234 "seek_hole": false, 00:30:36.234 "seek_data": false, 00:30:36.234 "copy": false, 00:30:36.234 "nvme_iov_md": false 00:30:36.234 }, 00:30:36.234 "memory_domains": [ 00:30:36.234 { 00:30:36.234 "dma_device_id": "system", 00:30:36.234 "dma_device_type": 1 00:30:36.234 }, 00:30:36.234 { 00:30:36.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.234 "dma_device_type": 2 00:30:36.234 }, 00:30:36.234 { 00:30:36.234 "dma_device_id": "system", 00:30:36.234 "dma_device_type": 1 00:30:36.234 }, 00:30:36.234 { 00:30:36.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.234 "dma_device_type": 2 00:30:36.234 } 00:30:36.234 ], 00:30:36.234 "driver_specific": { 00:30:36.234 "raid": { 00:30:36.234 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:36.234 "strip_size_kb": 0, 00:30:36.234 "state": "online", 00:30:36.234 "raid_level": "raid1", 00:30:36.234 "superblock": true, 00:30:36.234 "num_base_bdevs": 2, 00:30:36.234 "num_base_bdevs_discovered": 2, 00:30:36.234 "num_base_bdevs_operational": 2, 00:30:36.234 "base_bdevs_list": [ 00:30:36.234 { 00:30:36.234 "name": "pt1", 00:30:36.234 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.234 "is_configured": true, 00:30:36.234 "data_offset": 256, 00:30:36.234 "data_size": 7936 00:30:36.234 }, 00:30:36.234 { 00:30:36.234 "name": "pt2", 00:30:36.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:36.234 "is_configured": true, 00:30:36.234 "data_offset": 256, 00:30:36.234 "data_size": 7936 00:30:36.234 } 00:30:36.234 ] 00:30:36.234 } 00:30:36.234 } 00:30:36.234 }' 00:30:36.235 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:36.235 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:36.235 pt2' 00:30:36.235 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:36.235 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:36.235 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:36.494 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:36.494 "name": "pt1", 00:30:36.494 "aliases": [ 00:30:36.494 "00000000-0000-0000-0000-000000000001" 00:30:36.494 ], 00:30:36.494 "product_name": "passthru", 00:30:36.494 "block_size": 4128, 00:30:36.494 "num_blocks": 8192, 00:30:36.494 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:36.494 "md_size": 32, 00:30:36.494 "md_interleave": true, 00:30:36.494 "dif_type": 0, 00:30:36.494 "assigned_rate_limits": { 00:30:36.494 "rw_ios_per_sec": 0, 00:30:36.494 "rw_mbytes_per_sec": 0, 00:30:36.494 "r_mbytes_per_sec": 0, 00:30:36.494 "w_mbytes_per_sec": 0 00:30:36.494 }, 00:30:36.494 "claimed": true, 00:30:36.494 "claim_type": "exclusive_write", 00:30:36.494 "zoned": false, 00:30:36.494 "supported_io_types": { 00:30:36.494 "read": true, 00:30:36.494 "write": true, 00:30:36.494 "unmap": true, 00:30:36.494 "flush": true, 00:30:36.494 "reset": true, 00:30:36.494 "nvme_admin": false, 00:30:36.494 "nvme_io": false, 00:30:36.494 "nvme_io_md": false, 00:30:36.494 "write_zeroes": true, 00:30:36.494 "zcopy": true, 00:30:36.494 "get_zone_info": false, 00:30:36.494 "zone_management": false, 00:30:36.494 "zone_append": false, 00:30:36.494 "compare": false, 00:30:36.494 "compare_and_write": false, 00:30:36.494 
"abort": true, 00:30:36.494 "seek_hole": false, 00:30:36.494 "seek_data": false, 00:30:36.494 "copy": true, 00:30:36.494 "nvme_iov_md": false 00:30:36.494 }, 00:30:36.494 "memory_domains": [ 00:30:36.494 { 00:30:36.494 "dma_device_id": "system", 00:30:36.494 "dma_device_type": 1 00:30:36.494 }, 00:30:36.494 { 00:30:36.494 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:36.494 "dma_device_type": 2 00:30:36.494 } 00:30:36.494 ], 00:30:36.494 "driver_specific": { 00:30:36.494 "passthru": { 00:30:36.494 "name": "pt1", 00:30:36.494 "base_bdev_name": "malloc1" 00:30:36.494 } 00:30:36.494 } 00:30:36.494 }' 00:30:36.494 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.494 12:11:49 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:36.494 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:36.494 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.494 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:36.753 12:11:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:36.753 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:37.013 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:37.013 "name": "pt2", 00:30:37.013 "aliases": [ 00:30:37.013 "00000000-0000-0000-0000-000000000002" 00:30:37.013 ], 00:30:37.013 "product_name": "passthru", 00:30:37.013 "block_size": 4128, 00:30:37.013 "num_blocks": 8192, 00:30:37.013 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:37.013 "md_size": 32, 00:30:37.013 "md_interleave": true, 00:30:37.013 "dif_type": 0, 00:30:37.013 "assigned_rate_limits": { 00:30:37.013 "rw_ios_per_sec": 0, 00:30:37.013 "rw_mbytes_per_sec": 0, 00:30:37.013 "r_mbytes_per_sec": 0, 00:30:37.013 "w_mbytes_per_sec": 0 00:30:37.013 }, 00:30:37.013 "claimed": true, 00:30:37.013 "claim_type": "exclusive_write", 00:30:37.013 "zoned": false, 00:30:37.013 "supported_io_types": { 00:30:37.013 "read": true, 00:30:37.013 "write": true, 00:30:37.013 "unmap": true, 00:30:37.013 "flush": true, 00:30:37.013 "reset": true, 00:30:37.013 "nvme_admin": false, 00:30:37.013 "nvme_io": false, 00:30:37.013 "nvme_io_md": false, 00:30:37.013 "write_zeroes": true, 00:30:37.013 "zcopy": true, 00:30:37.013 "get_zone_info": false, 00:30:37.013 "zone_management": false, 00:30:37.013 "zone_append": false, 00:30:37.013 "compare": false, 00:30:37.013 "compare_and_write": false, 00:30:37.013 "abort": true, 00:30:37.013 "seek_hole": false, 00:30:37.013 "seek_data": false, 00:30:37.013 "copy": true, 00:30:37.013 "nvme_iov_md": false 00:30:37.013 }, 00:30:37.013 "memory_domains": [ 00:30:37.013 { 00:30:37.013 "dma_device_id": 
"system", 00:30:37.013 "dma_device_type": 1 00:30:37.013 }, 00:30:37.013 { 00:30:37.013 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:37.013 "dma_device_type": 2 00:30:37.013 } 00:30:37.013 ], 00:30:37.013 "driver_specific": { 00:30:37.013 "passthru": { 00:30:37.013 "name": "pt2", 00:30:37.013 "base_bdev_name": "malloc2" 00:30:37.013 } 00:30:37.013 } 00:30:37.013 }' 00:30:37.013 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:37.013 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:37.273 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.531 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:37.531 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:37.531 12:11:50 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:37.531 12:11:50 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:30:37.791 [2024-07-15 12:11:51.179117] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:37.791 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=d9207563-b757-4b0d-9cc6-ee13a0b82b7d 00:30:37.791 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z d9207563-b757-4b0d-9cc6-ee13a0b82b7d ']' 00:30:37.791 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:38.050 [2024-07-15 12:11:51.423496] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:38.050 [2024-07-15 12:11:51.423514] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:38.050 [2024-07-15 12:11:51.423565] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:38.050 [2024-07-15 12:11:51.423620] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:38.050 [2024-07-15 12:11:51.423632] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2072fa0 name raid_bdev1, state offline 00:30:38.050 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:38.050 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:30:38.309 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:30:38.309 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:30:38.309 12:11:51 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:38.309 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:38.569 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:30:38.569 12:11:51 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:38.828 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:30:38.828 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:39.088 12:11:52 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:30:39.088 [2024-07-15 12:11:52.658725] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:30:39.088 [2024-07-15 12:11:52.660109] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:30:39.088 [2024-07-15 12:11:52.660167] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:30:39.088 [2024-07-15 12:11:52.660207] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:30:39.088 [2024-07-15 12:11:52.660225] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:39.088 [2024-07-15 12:11:52.660235] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ef13b0 name raid_bdev1, state configuring 00:30:39.088 request: 00:30:39.088 { 00:30:39.088 "name": "raid_bdev1", 00:30:39.088 "raid_level": "raid1", 00:30:39.088 "base_bdevs": [ 00:30:39.088 "malloc1", 00:30:39.088 "malloc2" 00:30:39.088 ], 00:30:39.088 "superblock": false, 00:30:39.088 "method": "bdev_raid_create", 00:30:39.088 "req_id": 1 00:30:39.088 } 00:30:39.088 Got JSON-RPC error response 00:30:39.088 response: 00:30:39.088 { 00:30:39.088 "code": -17, 00:30:39.088 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:30:39.088 } 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.088 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:30:39.347 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:30:39.347 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:30:39.606 12:11:52 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:30:39.606 [2024-07-15 12:11:53.099828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:39.606 [2024-07-15 12:11:53.099871] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:39.606 [2024-07-15 12:11:53.099888] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x207f650 00:30:39.606 [2024-07-15 12:11:53.099900] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:39.606 [2024-07-15 12:11:53.101288] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:39.606 [2024-07-15 12:11:53.101315] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:39.606 [2024-07-15 12:11:53.101360] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:39.606 [2024-07-15 12:11:53.101383] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:39.606 pt1 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:39.606 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:39.863 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:39.863 "name": "raid_bdev1", 00:30:39.863 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:39.863 "strip_size_kb": 0, 00:30:39.863 "state": "configuring", 00:30:39.863 "raid_level": "raid1", 00:30:39.863 "superblock": true, 00:30:39.863 "num_base_bdevs": 2, 00:30:39.863 "num_base_bdevs_discovered": 1, 00:30:39.863 "num_base_bdevs_operational": 2, 00:30:39.863 "base_bdevs_list": [ 00:30:39.863 { 00:30:39.863 "name": "pt1", 00:30:39.863 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:39.863 "is_configured": true, 00:30:39.863 "data_offset": 256, 00:30:39.863 "data_size": 7936 00:30:39.863 }, 00:30:39.863 { 00:30:39.863 "name": null, 00:30:39.863 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:39.863 "is_configured": false, 00:30:39.863 "data_offset": 256, 00:30:39.863 "data_size": 7936 00:30:39.863 } 00:30:39.863 ] 00:30:39.863 }' 00:30:39.863 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:39.863 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:40.428 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:30:40.428 12:11:53 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:30:40.428 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:40.428 12:11:53 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:40.689 [2024-07-15 12:11:54.166663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:40.689 [2024-07-15 12:11:54.166717] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:40.689 [2024-07-15 12:11:54.166736] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2073cf0 00:30:40.689 [2024-07-15 12:11:54.166748] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:40.689 [2024-07-15 12:11:54.166910] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:40.689 [2024-07-15 12:11:54.166926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:40.689 [2024-07-15 12:11:54.166967] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:40.689 [2024-07-15 12:11:54.166984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:40.689 [2024-07-15 12:11:54.167066] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x20751b0 00:30:40.689 [2024-07-15 12:11:54.167076] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:40.689 [2024-07-15 12:11:54.167134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2076470 00:30:40.689 [2024-07-15 12:11:54.167207] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20751b0 00:30:40.689 [2024-07-15 12:11:54.167217] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20751b0 00:30:40.689 [2024-07-15 12:11:54.167275] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:40.689 pt2 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:40.689 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:40.689 12:11:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:40.949 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:40.949 "name": "raid_bdev1", 00:30:40.949 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:40.949 "strip_size_kb": 0, 00:30:40.949 "state": "online", 00:30:40.949 "raid_level": "raid1", 00:30:40.949 "superblock": true, 00:30:40.949 "num_base_bdevs": 2, 00:30:40.949 "num_base_bdevs_discovered": 2, 00:30:40.949 "num_base_bdevs_operational": 2, 00:30:40.949 "base_bdevs_list": [ 00:30:40.949 { 00:30:40.949 "name": "pt1", 00:30:40.949 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:40.949 "is_configured": true, 00:30:40.949 "data_offset": 256, 00:30:40.949 "data_size": 7936 00:30:40.949 }, 00:30:40.949 { 00:30:40.949 "name": "pt2", 00:30:40.949 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:40.949 "is_configured": true, 00:30:40.949 "data_offset": 256, 00:30:40.949 "data_size": 7936 00:30:40.949 } 00:30:40.949 ] 00:30:40.949 }' 00:30:40.949 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:40.949 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:41.516 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:30:41.517 12:11:54 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:41.517 12:11:54 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:30:41.776 [2024-07-15 12:11:55.201674] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:41.776 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:30:41.776 "name": "raid_bdev1", 00:30:41.776 "aliases": [ 00:30:41.776 "d9207563-b757-4b0d-9cc6-ee13a0b82b7d" 00:30:41.776 ], 00:30:41.776 "product_name": "Raid Volume", 00:30:41.776 "block_size": 4128, 00:30:41.776 "num_blocks": 7936, 00:30:41.776 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:41.776 "md_size": 32, 00:30:41.776 "md_interleave": true, 00:30:41.776 "dif_type": 0, 00:30:41.776 "assigned_rate_limits": { 00:30:41.776 "rw_ios_per_sec": 0, 00:30:41.776 "rw_mbytes_per_sec": 0, 00:30:41.776 "r_mbytes_per_sec": 0, 00:30:41.776 "w_mbytes_per_sec": 0 00:30:41.776 }, 00:30:41.776 "claimed": false, 00:30:41.776 "zoned": false, 00:30:41.776 "supported_io_types": { 00:30:41.776 "read": true, 00:30:41.776 "write": true, 00:30:41.776 "unmap": false, 00:30:41.776 "flush": false, 00:30:41.776 "reset": true, 00:30:41.776 "nvme_admin": false, 00:30:41.776 "nvme_io": false, 00:30:41.776 "nvme_io_md": false, 00:30:41.776 "write_zeroes": true, 00:30:41.776 "zcopy": false, 00:30:41.776 "get_zone_info": false, 00:30:41.776 "zone_management": false, 00:30:41.776 "zone_append": false, 00:30:41.776 "compare": false, 00:30:41.776 "compare_and_write": false, 00:30:41.776 "abort": false, 00:30:41.776 "seek_hole": false, 00:30:41.776 "seek_data": false, 00:30:41.776 "copy": false, 00:30:41.776 "nvme_iov_md": false 00:30:41.776 }, 
00:30:41.776 "memory_domains": [ 00:30:41.776 { 00:30:41.776 "dma_device_id": "system", 00:30:41.776 "dma_device_type": 1 00:30:41.776 }, 00:30:41.776 { 00:30:41.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:41.776 "dma_device_type": 2 00:30:41.776 }, 00:30:41.776 { 00:30:41.776 "dma_device_id": "system", 00:30:41.776 "dma_device_type": 1 00:30:41.776 }, 00:30:41.776 { 00:30:41.776 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:41.776 "dma_device_type": 2 00:30:41.776 } 00:30:41.776 ], 00:30:41.776 "driver_specific": { 00:30:41.776 "raid": { 00:30:41.776 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:41.776 "strip_size_kb": 0, 00:30:41.776 "state": "online", 00:30:41.776 "raid_level": "raid1", 00:30:41.776 "superblock": true, 00:30:41.776 "num_base_bdevs": 2, 00:30:41.776 "num_base_bdevs_discovered": 2, 00:30:41.776 "num_base_bdevs_operational": 2, 00:30:41.776 "base_bdevs_list": [ 00:30:41.776 { 00:30:41.776 "name": "pt1", 00:30:41.776 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:41.776 "is_configured": true, 00:30:41.776 "data_offset": 256, 00:30:41.776 "data_size": 7936 00:30:41.776 }, 00:30:41.776 { 00:30:41.776 "name": "pt2", 00:30:41.776 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:41.776 "is_configured": true, 00:30:41.776 "data_offset": 256, 00:30:41.776 "data_size": 7936 00:30:41.776 } 00:30:41.776 ] 00:30:41.776 } 00:30:41.776 } 00:30:41.776 }' 00:30:41.776 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:30:41.776 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:30:41.776 pt2' 00:30:41.777 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:41.777 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:30:41.777 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:42.035 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:42.035 "name": "pt1", 00:30:42.035 "aliases": [ 00:30:42.035 "00000000-0000-0000-0000-000000000001" 00:30:42.035 ], 00:30:42.035 "product_name": "passthru", 00:30:42.035 "block_size": 4128, 00:30:42.035 "num_blocks": 8192, 00:30:42.035 "uuid": "00000000-0000-0000-0000-000000000001", 00:30:42.035 "md_size": 32, 00:30:42.035 "md_interleave": true, 00:30:42.035 "dif_type": 0, 00:30:42.035 "assigned_rate_limits": { 00:30:42.035 "rw_ios_per_sec": 0, 00:30:42.035 "rw_mbytes_per_sec": 0, 00:30:42.035 "r_mbytes_per_sec": 0, 00:30:42.035 "w_mbytes_per_sec": 0 00:30:42.035 }, 00:30:42.035 "claimed": true, 00:30:42.035 "claim_type": "exclusive_write", 00:30:42.035 "zoned": false, 00:30:42.035 "supported_io_types": { 00:30:42.035 "read": true, 00:30:42.035 "write": true, 00:30:42.035 "unmap": true, 00:30:42.035 "flush": true, 00:30:42.035 "reset": true, 00:30:42.035 "nvme_admin": false, 00:30:42.035 "nvme_io": false, 00:30:42.035 "nvme_io_md": false, 00:30:42.035 "write_zeroes": true, 00:30:42.035 "zcopy": true, 00:30:42.035 "get_zone_info": false, 00:30:42.035 "zone_management": false, 00:30:42.035 "zone_append": false, 00:30:42.035 "compare": false, 00:30:42.035 "compare_and_write": false, 00:30:42.035 "abort": true, 00:30:42.035 "seek_hole": false, 00:30:42.035 "seek_data": false, 00:30:42.035 "copy": true, 00:30:42.035 "nvme_iov_md": false 00:30:42.035 }, 00:30:42.035 "memory_domains": [ 00:30:42.035 { 00:30:42.035 "dma_device_id": "system", 00:30:42.035 "dma_device_type": 1 00:30:42.035 }, 00:30:42.035 { 00:30:42.035 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:42.035 "dma_device_type": 2 00:30:42.035 } 00:30:42.035 ], 00:30:42.035 
"driver_specific": { 00:30:42.035 "passthru": { 00:30:42.035 "name": "pt1", 00:30:42.035 "base_bdev_name": "malloc1" 00:30:42.035 } 00:30:42.035 } 00:30:42.035 }' 00:30:42.035 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.035 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.035 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:42.035 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:42.292 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:42.550 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:42.550 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:30:42.550 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:30:42.550 12:11:55 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:30:42.809 12:11:56 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:30:42.809 "name": "pt2", 00:30:42.809 "aliases": [ 00:30:42.809 "00000000-0000-0000-0000-000000000002" 00:30:42.809 ], 00:30:42.809 "product_name": "passthru", 00:30:42.809 "block_size": 4128, 00:30:42.809 "num_blocks": 8192, 00:30:42.809 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:42.809 "md_size": 32, 00:30:42.809 "md_interleave": true, 00:30:42.809 "dif_type": 0, 00:30:42.809 "assigned_rate_limits": { 00:30:42.809 "rw_ios_per_sec": 0, 00:30:42.809 "rw_mbytes_per_sec": 0, 00:30:42.809 "r_mbytes_per_sec": 0, 00:30:42.809 "w_mbytes_per_sec": 0 00:30:42.809 }, 00:30:42.809 "claimed": true, 00:30:42.809 "claim_type": "exclusive_write", 00:30:42.809 "zoned": false, 00:30:42.809 "supported_io_types": { 00:30:42.809 "read": true, 00:30:42.809 "write": true, 00:30:42.809 "unmap": true, 00:30:42.809 "flush": true, 00:30:42.809 "reset": true, 00:30:42.809 "nvme_admin": false, 00:30:42.809 "nvme_io": false, 00:30:42.809 "nvme_io_md": false, 00:30:42.809 "write_zeroes": true, 00:30:42.809 "zcopy": true, 00:30:42.809 "get_zone_info": false, 00:30:42.809 "zone_management": false, 00:30:42.809 "zone_append": false, 00:30:42.809 "compare": false, 00:30:42.809 "compare_and_write": false, 00:30:42.809 "abort": true, 00:30:42.809 "seek_hole": false, 00:30:42.809 "seek_data": false, 00:30:42.809 "copy": true, 00:30:42.809 "nvme_iov_md": false 00:30:42.809 }, 00:30:42.809 "memory_domains": [ 00:30:42.809 { 00:30:42.809 "dma_device_id": "system", 00:30:42.809 "dma_device_type": 1 00:30:42.809 }, 00:30:42.809 { 00:30:42.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:30:42.809 "dma_device_type": 2 00:30:42.809 } 00:30:42.809 ], 00:30:42.809 "driver_specific": { 00:30:42.809 "passthru": { 00:30:42.809 "name": "pt2", 00:30:42.809 "base_bdev_name": "malloc2" 00:30:42.809 } 00:30:42.809 } 00:30:42.809 }' 00:30:42.809 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.809 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:30:42.809 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:30:42.809 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:30:43.067 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:43.324 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:30:43.324 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:30:43.324 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:43.324 12:11:56 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:30:43.890 [2024-07-15 12:11:57.219043] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:43.890 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' d9207563-b757-4b0d-9cc6-ee13a0b82b7d '!=' d9207563-b757-4b0d-9cc6-ee13a0b82b7d ']' 00:30:43.890 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:30:43.890 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:30:43.890 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:30:43.890 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:30:44.152 [2024-07-15 12:11:57.527623] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:44.152 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:44.483 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:44.483 "name": "raid_bdev1", 00:30:44.483 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:44.483 "strip_size_kb": 0, 00:30:44.483 "state": "online", 00:30:44.483 "raid_level": "raid1", 00:30:44.483 "superblock": true, 00:30:44.483 "num_base_bdevs": 2, 00:30:44.483 "num_base_bdevs_discovered": 1, 00:30:44.483 "num_base_bdevs_operational": 1, 00:30:44.483 "base_bdevs_list": [ 00:30:44.483 { 00:30:44.483 "name": null, 00:30:44.483 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:44.483 "is_configured": false, 00:30:44.483 "data_offset": 256, 00:30:44.483 "data_size": 7936 00:30:44.483 }, 00:30:44.483 { 00:30:44.483 "name": "pt2", 00:30:44.483 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:44.483 "is_configured": true, 00:30:44.483 "data_offset": 256, 00:30:44.483 "data_size": 7936 00:30:44.483 } 00:30:44.483 ] 00:30:44.483 }' 00:30:44.483 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:44.483 12:11:57 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:45.049 12:11:58 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:45.617 [2024-07-15 12:11:59.003494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:45.617 [2024-07-15 12:11:59.003523] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:45.617 [2024-07-15 12:11:59.003582] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:45.617 [2024-07-15 
12:11:59.003620] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:45.617 [2024-07-15 12:11:59.003629] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20751b0 name raid_bdev1, state offline 00:30:45.617 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:45.617 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:30:45.876 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:30:45.876 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:30:45.876 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:30:45.876 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:30:45.876 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:30:46.135 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:30:46.135 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:30:46.135 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:30:46.135 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:30:46.135 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:30:46.136 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:30:46.394 [2024-07-15 12:11:59.825547] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:30:46.394 [2024-07-15 12:11:59.825589] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:46.394 [2024-07-15 12:11:59.825602] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ef1720 00:30:46.394 [2024-07-15 12:11:59.825610] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:46.394 [2024-07-15 12:11:59.827029] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:46.394 [2024-07-15 12:11:59.827056] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:30:46.394 [2024-07-15 12:11:59.827103] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:30:46.394 [2024-07-15 12:11:59.827127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:46.395 [2024-07-15 12:11:59.827187] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2075430 00:30:46.395 [2024-07-15 12:11:59.827195] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:46.395 [2024-07-15 12:11:59.827244] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ef1a40 00:30:46.395 [2024-07-15 12:11:59.827302] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2075430 00:30:46.395 [2024-07-15 12:11:59.827309] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2075430 00:30:46.395 [2024-07-15 12:11:59.827352] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:46.395 pt2 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:46.395 12:11:59 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:46.653 12:12:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:46.653 "name": "raid_bdev1", 00:30:46.653 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:46.653 "strip_size_kb": 0, 00:30:46.653 "state": "online", 00:30:46.653 "raid_level": "raid1", 00:30:46.653 "superblock": true, 00:30:46.653 "num_base_bdevs": 2, 00:30:46.653 "num_base_bdevs_discovered": 1, 00:30:46.653 "num_base_bdevs_operational": 1, 00:30:46.653 
"base_bdevs_list": [ 00:30:46.653 { 00:30:46.653 "name": null, 00:30:46.653 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:46.653 "is_configured": false, 00:30:46.653 "data_offset": 256, 00:30:46.653 "data_size": 7936 00:30:46.653 }, 00:30:46.653 { 00:30:46.653 "name": "pt2", 00:30:46.653 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:46.653 "is_configured": true, 00:30:46.653 "data_offset": 256, 00:30:46.653 "data_size": 7936 00:30:46.653 } 00:30:46.653 ] 00:30:46.653 }' 00:30:46.653 12:12:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:46.653 12:12:00 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:47.591 12:12:00 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:30:47.592 [2024-07-15 12:12:01.048718] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:47.592 [2024-07-15 12:12:01.048743] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:30:47.592 [2024-07-15 12:12:01.048794] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:47.592 [2024-07-15 12:12:01.048830] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:47.592 [2024-07-15 12:12:01.048839] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2075430 name raid_bdev1, state offline 00:30:47.592 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:47.592 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:30:47.850 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:30:47.850 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:30:47.850 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:30:47.850 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:30:48.109 [2024-07-15 12:12:01.561995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:30:48.109 [2024-07-15 12:12:01.562030] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:48.109 [2024-07-15 12:12:01.562044] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x208d360 00:30:48.109 [2024-07-15 12:12:01.562052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:48.109 [2024-07-15 12:12:01.563329] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:48.109 [2024-07-15 12:12:01.563352] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:30:48.109 [2024-07-15 12:12:01.563389] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:30:48.109 [2024-07-15 12:12:01.563409] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:30:48.109 [2024-07-15 12:12:01.563468] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:30:48.109 [2024-07-15 12:12:01.563477] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:30:48.109 [2024-07-15 12:12:01.563492] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2076580 name raid_bdev1, state configuring 00:30:48.109 [2024-07-15 12:12:01.563511] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:30:48.109 [2024-07-15 12:12:01.563550] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2076580 00:30:48.109 [2024-07-15 12:12:01.563557] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:48.109 [2024-07-15 12:12:01.563594] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2073380 00:30:48.109 [2024-07-15 12:12:01.563645] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2076580 00:30:48.109 [2024-07-15 12:12:01.563651] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2076580 00:30:48.109 [2024-07-15 12:12:01.563699] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:48.109 pt1 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:48.109 12:12:01 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:48.109 12:12:01 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:48.678 12:12:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:48.678 "name": "raid_bdev1", 00:30:48.678 "uuid": "d9207563-b757-4b0d-9cc6-ee13a0b82b7d", 00:30:48.678 "strip_size_kb": 0, 00:30:48.678 "state": "online", 00:30:48.678 "raid_level": "raid1", 00:30:48.678 "superblock": true, 00:30:48.678 "num_base_bdevs": 2, 00:30:48.678 "num_base_bdevs_discovered": 1, 00:30:48.678 "num_base_bdevs_operational": 1, 00:30:48.678 "base_bdevs_list": [ 00:30:48.678 { 00:30:48.678 "name": null, 00:30:48.678 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:48.678 "is_configured": false, 00:30:48.678 "data_offset": 256, 00:30:48.678 "data_size": 7936 00:30:48.678 }, 00:30:48.678 { 00:30:48.678 "name": "pt2", 00:30:48.678 "uuid": "00000000-0000-0000-0000-000000000002", 00:30:48.678 "is_configured": true, 00:30:48.678 "data_offset": 256, 00:30:48.678 "data_size": 7936 00:30:48.678 } 00:30:48.678 ] 00:30:48.678 }' 00:30:48.678 12:12:02 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:48.678 12:12:02 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:49.615 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:30:49.615 
12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:30:50.183 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:30:50.183 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:50.183 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:30:50.442 [2024-07-15 12:12:03.868117] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' d9207563-b757-4b0d-9cc6-ee13a0b82b7d '!=' d9207563-b757-4b0d-9cc6-ee13a0b82b7d ']' 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 1616185 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1616185 ']' 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@952 -- # kill -0 1616185 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1616185 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@966 
-- # echo 'killing process with pid 1616185' 00:30:50.443 killing process with pid 1616185 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # kill 1616185 00:30:50.443 [2024-07-15 12:12:03.954017] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:30:50.443 [2024-07-15 12:12:03.954068] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:30:50.443 [2024-07-15 12:12:03.954105] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:30:50.443 [2024-07-15 12:12:03.954113] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2076580 name raid_bdev1, state offline 00:30:50.443 12:12:03 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@972 -- # wait 1616185 00:30:50.443 [2024-07-15 12:12:03.994008] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:30:51.011 12:12:04 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:30:51.011 00:30:51.011 real 0m17.742s 00:30:51.011 user 0m32.801s 00:30:51.011 sys 0m3.127s 00:30:51.011 12:12:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:30:51.011 12:12:04 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:51.011 ************************************ 00:30:51.011 END TEST raid_superblock_test_md_interleaved 00:30:51.011 ************************************ 00:30:51.011 12:12:04 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:30:51.012 12:12:04 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:30:51.012 12:12:04 bdev_raid -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:30:51.012 12:12:04 bdev_raid -- common/autotest_common.sh@1105 -- # xtrace_disable 00:30:51.012 12:12:04 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:30:51.012 ************************************ 00:30:51.012 START TEST raid_rebuild_test_sb_md_interleaved 00:30:51.012 ************************************ 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1123 -- # raid_rebuild_test raid1 2 true false false 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 
'BaseBdev2') 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=1618989 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 1618989 /var/tmp/spdk-raid.sock 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@829 -- # '[' -z 1618989 ']' 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:30:51.012 
12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local max_retries=100 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:30:51.012 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # xtrace_disable 00:30:51.012 12:12:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:51.012 [2024-07-15 12:12:04.541723] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:30:51.012 [2024-07-15 12:12:04.541790] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1618989 ] 00:30:51.012 I/O size of 3145728 is greater than zero copy threshold (65536). 00:30:51.012 Zero copy mechanism will not be used. 
00:30:51.271 [2024-07-15 12:12:04.669268] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:51.271 [2024-07-15 12:12:04.771208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:51.271 [2024-07-15 12:12:04.836949] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:51.271 [2024-07-15 12:12:04.836986] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:30:52.208 12:12:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:30:52.208 12:12:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@862 -- # return 0 00:30:52.208 12:12:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:52.208 12:12:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:30:52.777 BaseBdev1_malloc 00:30:52.777 12:12:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:30:53.036 [2024-07-15 12:12:06.575482] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:30:53.036 [2024-07-15 12:12:06.575533] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:53.037 [2024-07-15 12:12:06.575556] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2579250 00:30:53.037 [2024-07-15 12:12:06.575569] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:53.037 [2024-07-15 12:12:06.577125] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:53.037 [2024-07-15 12:12:06.577153] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:30:53.037 BaseBdev1 00:30:53.037 12:12:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:30:53.037 12:12:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:30:53.605 BaseBdev2_malloc 00:30:53.605 12:12:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:30:53.605 [2024-07-15 12:12:07.151285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:30:53.606 [2024-07-15 12:12:07.151333] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:53.606 [2024-07-15 12:12:07.151354] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2706b30 00:30:53.606 [2024-07-15 12:12:07.151367] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:53.606 [2024-07-15 12:12:07.152761] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:53.606 [2024-07-15 12:12:07.152787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:30:53.606 BaseBdev2 00:30:53.606 12:12:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:30:54.175 spare_malloc 00:30:54.175 12:12:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:30:54.744 spare_delay 00:30:54.744 12:12:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:30:55.312 [2024-07-15 12:12:08.705693] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:30:55.312 [2024-07-15 12:12:08.705743] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:30:55.312 [2024-07-15 12:12:08.705762] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26fb540 00:30:55.312 [2024-07-15 12:12:08.705774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:30:55.312 [2024-07-15 12:12:08.707215] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:30:55.312 [2024-07-15 12:12:08.707241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:30:55.312 spare 00:30:55.312 12:12:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:30:55.880 [2024-07-15 12:12:09.219062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:30:55.880 [2024-07-15 12:12:09.220379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:30:55.880 [2024-07-15 12:12:09.220536] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2704f90 00:30:55.880 [2024-07-15 12:12:09.220549] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:30:55.881 [2024-07-15 12:12:09.220617] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256f650 00:30:55.881 [2024-07-15 12:12:09.220705] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x2704f90 00:30:55.881 [2024-07-15 12:12:09.220715] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2704f90 00:30:55.881 [2024-07-15 12:12:09.220772] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:55.881 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:56.449 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:30:56.449 "name": "raid_bdev1", 00:30:56.449 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:30:56.449 "strip_size_kb": 0, 00:30:56.449 "state": "online", 00:30:56.449 "raid_level": "raid1", 00:30:56.449 "superblock": true, 00:30:56.449 "num_base_bdevs": 2, 00:30:56.449 "num_base_bdevs_discovered": 2, 00:30:56.449 "num_base_bdevs_operational": 2, 00:30:56.449 "base_bdevs_list": [ 00:30:56.449 { 00:30:56.449 "name": "BaseBdev1", 00:30:56.449 "uuid": "c151486c-e1b1-5913-a829-5be8ff3a7d2a", 00:30:56.449 "is_configured": true, 00:30:56.449 "data_offset": 256, 00:30:56.449 "data_size": 7936 00:30:56.449 }, 00:30:56.449 { 00:30:56.449 "name": "BaseBdev2", 00:30:56.449 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:30:56.449 "is_configured": true, 00:30:56.449 "data_offset": 256, 00:30:56.449 "data_size": 7936 00:30:56.449 } 00:30:56.449 ] 00:30:56.449 }' 00:30:56.449 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:56.449 12:12:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:57.017 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:30:57.017 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:30:57.017 [2024-07-15 12:12:10.591063] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:30:57.276 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:30:57.277 12:12:10 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:30:57.536 [2024-07-15 12:12:11.100134] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:30:57.796 
12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:30:57.796 "name": "raid_bdev1", 00:30:57.796 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:30:57.796 "strip_size_kb": 0, 00:30:57.796 "state": "online", 00:30:57.796 "raid_level": "raid1", 00:30:57.796 "superblock": true, 00:30:57.796 "num_base_bdevs": 2, 00:30:57.796 "num_base_bdevs_discovered": 1, 00:30:57.796 "num_base_bdevs_operational": 1, 00:30:57.796 "base_bdevs_list": [ 00:30:57.796 { 00:30:57.796 "name": null, 00:30:57.796 "uuid": "00000000-0000-0000-0000-000000000000", 00:30:57.796 "is_configured": false, 00:30:57.796 "data_offset": 256, 00:30:57.796 "data_size": 7936 00:30:57.796 }, 00:30:57.796 { 00:30:57.796 "name": "BaseBdev2", 00:30:57.796 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:30:57.796 "is_configured": true, 00:30:57.796 "data_offset": 256, 00:30:57.796 "data_size": 7936 00:30:57.796 } 00:30:57.796 ] 00:30:57.796 }' 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:30:57.796 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:30:58.733 12:12:11 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:30:58.733 [2024-07-15 12:12:12.219130] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:30:58.733 [2024-07-15 12:12:12.222756] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x256f490 00:30:58.733 [2024-07-15 12:12:12.224978] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:30:58.733 12:12:12 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:30:59.670 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:30:59.929 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:30:59.929 "name": "raid_bdev1", 00:30:59.929 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:30:59.929 "strip_size_kb": 0, 00:30:59.929 "state": "online", 00:30:59.929 "raid_level": "raid1", 00:30:59.929 "superblock": true, 00:30:59.929 "num_base_bdevs": 2, 00:30:59.929 "num_base_bdevs_discovered": 2, 00:30:59.929 "num_base_bdevs_operational": 2, 00:30:59.929 "process": { 00:30:59.929 "type": "rebuild", 00:30:59.929 "target": "spare", 00:30:59.929 "progress": { 00:30:59.929 "blocks": 3072, 00:30:59.929 "percent": 38 00:30:59.929 } 00:30:59.929 }, 00:30:59.929 "base_bdevs_list": [ 00:30:59.929 { 
00:30:59.929 "name": "spare", 00:30:59.929 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:30:59.929 "is_configured": true, 00:30:59.929 "data_offset": 256, 00:30:59.929 "data_size": 7936 00:30:59.929 }, 00:30:59.929 { 00:30:59.929 "name": "BaseBdev2", 00:30:59.929 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:30:59.929 "is_configured": true, 00:30:59.929 "data_offset": 256, 00:30:59.929 "data_size": 7936 00:30:59.929 } 00:30:59.929 ] 00:30:59.929 }' 00:30:59.929 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:00.189 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:00.189 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:00.189 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:00.189 12:12:13 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:00.759 [2024-07-15 12:12:14.135703] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:00.759 [2024-07-15 12:12:14.140187] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:00.759 [2024-07-15 12:12:14.140234] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:00.759 [2024-07-15 12:12:14.140249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:00.759 [2024-07-15 12:12:14.140258] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:00.759 12:12:14 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:00.759 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:01.018 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:01.018 "name": "raid_bdev1", 00:31:01.018 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:01.018 "strip_size_kb": 0, 00:31:01.018 "state": "online", 00:31:01.018 "raid_level": "raid1", 00:31:01.018 "superblock": true, 00:31:01.018 "num_base_bdevs": 2, 00:31:01.018 "num_base_bdevs_discovered": 1, 00:31:01.018 "num_base_bdevs_operational": 1, 00:31:01.018 "base_bdevs_list": [ 00:31:01.018 { 00:31:01.018 "name": null, 00:31:01.018 
"uuid": "00000000-0000-0000-0000-000000000000", 00:31:01.018 "is_configured": false, 00:31:01.018 "data_offset": 256, 00:31:01.018 "data_size": 7936 00:31:01.018 }, 00:31:01.018 { 00:31:01.018 "name": "BaseBdev2", 00:31:01.018 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:01.018 "is_configured": true, 00:31:01.018 "data_offset": 256, 00:31:01.018 "data_size": 7936 00:31:01.018 } 00:31:01.018 ] 00:31:01.018 }' 00:31:01.018 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:01.018 12:12:14 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:01.956 "name": "raid_bdev1", 00:31:01.956 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:01.956 "strip_size_kb": 0, 00:31:01.956 "state": "online", 00:31:01.956 "raid_level": "raid1", 00:31:01.956 "superblock": true, 00:31:01.956 
"num_base_bdevs": 2, 00:31:01.956 "num_base_bdevs_discovered": 1, 00:31:01.956 "num_base_bdevs_operational": 1, 00:31:01.956 "base_bdevs_list": [ 00:31:01.956 { 00:31:01.956 "name": null, 00:31:01.956 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:01.956 "is_configured": false, 00:31:01.956 "data_offset": 256, 00:31:01.956 "data_size": 7936 00:31:01.956 }, 00:31:01.956 { 00:31:01.956 "name": "BaseBdev2", 00:31:01.956 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:01.956 "is_configured": true, 00:31:01.956 "data_offset": 256, 00:31:01.956 "data_size": 7936 00:31:01.956 } 00:31:01.956 ] 00:31:01.956 }' 00:31:01.956 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:02.215 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:02.215 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:02.215 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:02.215 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:02.474 [2024-07-15 12:12:15.829050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:02.474 [2024-07-15 12:12:15.832611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2252fc0 00:31:02.474 [2024-07-15 12:12:15.834070] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:02.474 12:12:15 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:31:03.411 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.412 12:12:16 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:03.670 "name": "raid_bdev1", 00:31:03.670 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:03.670 "strip_size_kb": 0, 00:31:03.670 "state": "online", 00:31:03.670 "raid_level": "raid1", 00:31:03.670 "superblock": true, 00:31:03.670 "num_base_bdevs": 2, 00:31:03.670 "num_base_bdevs_discovered": 2, 00:31:03.670 "num_base_bdevs_operational": 2, 00:31:03.670 "process": { 00:31:03.670 "type": "rebuild", 00:31:03.670 "target": "spare", 00:31:03.670 "progress": { 00:31:03.670 "blocks": 3072, 00:31:03.670 "percent": 38 00:31:03.670 } 00:31:03.670 }, 00:31:03.670 "base_bdevs_list": [ 00:31:03.670 { 00:31:03.670 "name": "spare", 00:31:03.670 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:03.670 "is_configured": true, 00:31:03.670 "data_offset": 256, 00:31:03.670 "data_size": 7936 00:31:03.670 }, 00:31:03.670 { 00:31:03.670 "name": "BaseBdev2", 00:31:03.670 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:03.670 "is_configured": true, 00:31:03.670 "data_offset": 256, 00:31:03.670 "data_size": 7936 00:31:03.670 
} 00:31:03.670 ] 00:31:03.670 }' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:31:03.670 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1171 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:03.670 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:03.929 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:03.929 "name": "raid_bdev1", 00:31:03.929 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:03.929 "strip_size_kb": 0, 00:31:03.929 "state": "online", 00:31:03.929 "raid_level": "raid1", 00:31:03.929 "superblock": true, 00:31:03.929 "num_base_bdevs": 2, 00:31:03.929 "num_base_bdevs_discovered": 2, 00:31:03.929 "num_base_bdevs_operational": 2, 00:31:03.929 "process": { 00:31:03.929 "type": "rebuild", 00:31:03.929 "target": "spare", 00:31:03.929 "progress": { 00:31:03.929 "blocks": 4096, 00:31:03.929 "percent": 51 00:31:03.929 } 00:31:03.929 }, 00:31:03.929 "base_bdevs_list": [ 00:31:03.929 { 00:31:03.929 "name": "spare", 00:31:03.929 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:03.929 "is_configured": true, 00:31:03.929 "data_offset": 256, 00:31:03.929 "data_size": 7936 00:31:03.929 }, 00:31:03.929 { 00:31:03.929 "name": "BaseBdev2", 00:31:03.929 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:03.929 "is_configured": true, 00:31:03.929 "data_offset": 256, 00:31:03.929 "data_size": 7936 00:31:03.929 } 00:31:03.929 ] 00:31:03.929 }' 00:31:03.929 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:04.187 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:04.187 12:12:17 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:04.187 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:04.187 12:12:17 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:05.123 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:05.381 "name": "raid_bdev1", 00:31:05.381 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:05.381 "strip_size_kb": 0, 00:31:05.381 "state": "online", 00:31:05.381 "raid_level": "raid1", 00:31:05.381 "superblock": true, 00:31:05.381 "num_base_bdevs": 2, 00:31:05.381 "num_base_bdevs_discovered": 2, 00:31:05.381 "num_base_bdevs_operational": 2, 00:31:05.381 "process": { 00:31:05.381 "type": "rebuild", 00:31:05.381 
"target": "spare", 00:31:05.381 "progress": { 00:31:05.381 "blocks": 7168, 00:31:05.381 "percent": 90 00:31:05.381 } 00:31:05.381 }, 00:31:05.381 "base_bdevs_list": [ 00:31:05.381 { 00:31:05.381 "name": "spare", 00:31:05.381 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:05.381 "is_configured": true, 00:31:05.381 "data_offset": 256, 00:31:05.381 "data_size": 7936 00:31:05.381 }, 00:31:05.381 { 00:31:05.381 "name": "BaseBdev2", 00:31:05.381 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:05.381 "is_configured": true, 00:31:05.381 "data_offset": 256, 00:31:05.381 "data_size": 7936 00:31:05.381 } 00:31:05.381 ] 00:31:05.381 }' 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:05.381 12:12:18 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:31:05.381 [2024-07-15 12:12:18.957727] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:31:05.381 [2024-07-15 12:12:18.957780] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:31:05.381 [2024-07-15 12:12:18.957861] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:06.317 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:06.318 12:12:19 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:06.885 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:06.885 "name": "raid_bdev1", 00:31:06.885 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:06.885 "strip_size_kb": 0, 00:31:06.885 "state": "online", 00:31:06.885 "raid_level": "raid1", 00:31:06.885 "superblock": true, 00:31:06.885 "num_base_bdevs": 2, 00:31:06.885 "num_base_bdevs_discovered": 2, 00:31:06.885 "num_base_bdevs_operational": 2, 00:31:06.885 "base_bdevs_list": [ 00:31:06.885 { 00:31:06.885 "name": "spare", 00:31:06.885 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:06.885 "is_configured": true, 00:31:06.885 "data_offset": 256, 00:31:06.885 "data_size": 7936 00:31:06.885 }, 00:31:06.885 { 00:31:06.885 "name": "BaseBdev2", 00:31:06.885 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:06.885 "is_configured": true, 00:31:06.885 "data_offset": 256, 00:31:06.885 "data_size": 7936 00:31:06.885 } 00:31:06.885 ] 00:31:06.885 }' 00:31:06.885 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:06.885 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:31:06.885 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:07.144 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:07.403 "name": "raid_bdev1", 00:31:07.403 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:07.403 "strip_size_kb": 0, 00:31:07.403 "state": "online", 00:31:07.403 "raid_level": "raid1", 00:31:07.403 "superblock": true, 00:31:07.403 "num_base_bdevs": 2, 00:31:07.403 "num_base_bdevs_discovered": 2, 00:31:07.403 "num_base_bdevs_operational": 2, 00:31:07.403 "base_bdevs_list": [ 00:31:07.403 { 00:31:07.403 "name": "spare", 00:31:07.403 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:07.403 
"is_configured": true, 00:31:07.403 "data_offset": 256, 00:31:07.403 "data_size": 7936 00:31:07.403 }, 00:31:07.403 { 00:31:07.403 "name": "BaseBdev2", 00:31:07.403 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:07.403 "is_configured": true, 00:31:07.403 "data_offset": 256, 00:31:07.403 "data_size": 7936 00:31:07.403 } 00:31:07.403 ] 00:31:07.403 }' 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:07.403 12:12:20 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:07.970 12:12:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:07.970 "name": "raid_bdev1", 00:31:07.970 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:07.970 "strip_size_kb": 0, 00:31:07.970 "state": "online", 00:31:07.970 "raid_level": "raid1", 00:31:07.970 "superblock": true, 00:31:07.970 "num_base_bdevs": 2, 00:31:07.970 "num_base_bdevs_discovered": 2, 00:31:07.970 "num_base_bdevs_operational": 2, 00:31:07.970 "base_bdevs_list": [ 00:31:07.970 { 00:31:07.970 "name": "spare", 00:31:07.970 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:07.970 "is_configured": true, 00:31:07.970 "data_offset": 256, 00:31:07.970 "data_size": 7936 00:31:07.970 }, 00:31:07.970 { 00:31:07.970 "name": "BaseBdev2", 00:31:07.970 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:07.970 "is_configured": true, 00:31:07.970 "data_offset": 256, 00:31:07.970 "data_size": 7936 00:31:07.970 } 00:31:07.970 ] 00:31:07.970 }' 00:31:07.970 12:12:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:07.970 12:12:21 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:08.537 12:12:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:31:09.104 [2024-07-15 12:12:22.499476] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:31:09.104 [2024-07-15 12:12:22.499505] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:31:09.104 [2024-07-15 12:12:22.499564] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:09.104 [2024-07-15 12:12:22.499619] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:09.104 [2024-07-15 12:12:22.499631] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2704f90 name raid_bdev1, state offline 00:31:09.104 12:12:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:09.104 12:12:22 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:31:09.682 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:31:09.682 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:31:09.682 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:31:09.682 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:09.989 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:10.259 [2024-07-15 12:12:23.790818] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:10.259 [2024-07-15 12:12:23.790862] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:10.259 [2024-07-15 12:12:23.790884] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2571ab0 00:31:10.259 [2024-07-15 12:12:23.790896] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:10.259 [2024-07-15 12:12:23.792695] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:10.259 [2024-07-15 12:12:23.792724] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:10.259 [2024-07-15 12:12:23.792781] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:10.259 [2024-07-15 12:12:23.792806] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:10.259 [2024-07-15 12:12:23.792890] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:31:10.259 spare 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:10.259 12:12:23 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:10.259 12:12:23 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:10.517 [2024-07-15 12:12:23.893195] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25779d0 00:31:10.517 [2024-07-15 12:12:23.893211] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:31:10.517 [2024-07-15 12:12:23.893287] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x256f490 00:31:10.517 [2024-07-15 12:12:23.893376] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25779d0 00:31:10.517 [2024-07-15 12:12:23.893386] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25779d0 00:31:10.517 [2024-07-15 12:12:23.893451] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:10.517 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:10.517 "name": "raid_bdev1", 00:31:10.517 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:10.517 "strip_size_kb": 0, 00:31:10.517 "state": "online", 00:31:10.517 "raid_level": "raid1", 00:31:10.517 "superblock": true, 00:31:10.517 "num_base_bdevs": 2, 00:31:10.517 "num_base_bdevs_discovered": 2, 00:31:10.517 "num_base_bdevs_operational": 2, 00:31:10.517 "base_bdevs_list": [ 00:31:10.517 { 00:31:10.517 "name": "spare", 00:31:10.517 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:10.517 "is_configured": true, 00:31:10.517 "data_offset": 256, 00:31:10.517 "data_size": 7936 00:31:10.517 }, 00:31:10.517 { 00:31:10.517 "name": "BaseBdev2", 00:31:10.517 "uuid": 
"62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:10.517 "is_configured": true, 00:31:10.517 "data_offset": 256, 00:31:10.517 "data_size": 7936 00:31:10.517 } 00:31:10.517 ] 00:31:10.517 }' 00:31:10.517 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:10.517 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.082 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:11.340 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:11.340 "name": "raid_bdev1", 00:31:11.340 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:11.340 "strip_size_kb": 0, 00:31:11.340 "state": "online", 00:31:11.340 "raid_level": "raid1", 00:31:11.340 "superblock": true, 00:31:11.340 "num_base_bdevs": 2, 00:31:11.340 "num_base_bdevs_discovered": 2, 00:31:11.340 "num_base_bdevs_operational": 2, 00:31:11.340 "base_bdevs_list": [ 00:31:11.340 { 00:31:11.340 "name": "spare", 00:31:11.340 "uuid": 
"f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:11.340 "is_configured": true, 00:31:11.340 "data_offset": 256, 00:31:11.340 "data_size": 7936 00:31:11.340 }, 00:31:11.340 { 00:31:11.340 "name": "BaseBdev2", 00:31:11.340 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:11.340 "is_configured": true, 00:31:11.340 "data_offset": 256, 00:31:11.340 "data_size": 7936 00:31:11.340 } 00:31:11.340 ] 00:31:11.340 }' 00:31:11.340 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:11.597 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:11.597 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:11.597 12:12:24 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:11.597 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:11.597 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:31:11.597 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:31:11.597 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:31:12.162 [2024-07-15 12:12:25.651866] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:12.162 12:12:25 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:12.728 12:12:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:12.728 "name": "raid_bdev1", 00:31:12.728 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:12.728 "strip_size_kb": 0, 00:31:12.728 "state": "online", 00:31:12.728 "raid_level": "raid1", 00:31:12.728 "superblock": true, 00:31:12.728 "num_base_bdevs": 2, 00:31:12.728 "num_base_bdevs_discovered": 1, 00:31:12.728 "num_base_bdevs_operational": 1, 00:31:12.728 "base_bdevs_list": [ 00:31:12.728 { 00:31:12.728 "name": null, 00:31:12.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:12.728 "is_configured": false, 00:31:12.728 "data_offset": 
256, 00:31:12.728 "data_size": 7936 00:31:12.728 }, 00:31:12.728 { 00:31:12.728 "name": "BaseBdev2", 00:31:12.728 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:12.728 "is_configured": true, 00:31:12.728 "data_offset": 256, 00:31:12.728 "data_size": 7936 00:31:12.728 } 00:31:12.728 ] 00:31:12.728 }' 00:31:12.728 12:12:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:12.728 12:12:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:13.294 12:12:26 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:31:13.552 [2024-07-15 12:12:27.019504] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:13.553 [2024-07-15 12:12:27.019648] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:13.553 [2024-07-15 12:12:27.019664] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:31:13.553 [2024-07-15 12:12:27.019695] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:13.553 [2024-07-15 12:12:27.023140] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26fbe30 00:31:13.553 [2024-07-15 12:12:27.025339] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:13.553 12:12:27 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:14.488 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:14.748 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:14.748 "name": "raid_bdev1", 00:31:14.748 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:14.748 "strip_size_kb": 0, 00:31:14.748 "state": "online", 00:31:14.748 "raid_level": "raid1", 00:31:14.748 "superblock": true, 00:31:14.748 "num_base_bdevs": 2, 00:31:14.748 "num_base_bdevs_discovered": 2, 00:31:14.748 "num_base_bdevs_operational": 2, 00:31:14.748 "process": { 00:31:14.748 "type": 
"rebuild", 00:31:14.748 "target": "spare", 00:31:14.748 "progress": { 00:31:14.748 "blocks": 3072, 00:31:14.748 "percent": 38 00:31:14.748 } 00:31:14.748 }, 00:31:14.748 "base_bdevs_list": [ 00:31:14.748 { 00:31:14.748 "name": "spare", 00:31:14.748 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:14.748 "is_configured": true, 00:31:14.748 "data_offset": 256, 00:31:14.748 "data_size": 7936 00:31:14.748 }, 00:31:14.748 { 00:31:14.748 "name": "BaseBdev2", 00:31:14.748 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:14.748 "is_configured": true, 00:31:14.748 "data_offset": 256, 00:31:14.748 "data_size": 7936 00:31:14.748 } 00:31:14.748 ] 00:31:14.748 }' 00:31:14.748 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:14.748 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:14.748 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:15.006 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:15.006 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:15.266 [2024-07-15 12:12:28.614964] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:15.266 [2024-07-15 12:12:28.637889] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:15.266 [2024-07-15 12:12:28.637930] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:15.266 [2024-07-15 12:12:28.637945] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:15.266 [2024-07-15 12:12:28.637952] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:15.266 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:15.525 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:15.525 "name": "raid_bdev1", 00:31:15.525 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:15.525 "strip_size_kb": 0, 00:31:15.525 "state": "online", 00:31:15.525 "raid_level": "raid1", 00:31:15.525 "superblock": true, 00:31:15.525 
"num_base_bdevs": 2, 00:31:15.525 "num_base_bdevs_discovered": 1, 00:31:15.525 "num_base_bdevs_operational": 1, 00:31:15.525 "base_bdevs_list": [ 00:31:15.525 { 00:31:15.525 "name": null, 00:31:15.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:15.525 "is_configured": false, 00:31:15.525 "data_offset": 256, 00:31:15.525 "data_size": 7936 00:31:15.525 }, 00:31:15.525 { 00:31:15.525 "name": "BaseBdev2", 00:31:15.525 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:15.525 "is_configured": true, 00:31:15.525 "data_offset": 256, 00:31:15.525 "data_size": 7936 00:31:15.525 } 00:31:15.525 ] 00:31:15.525 }' 00:31:15.525 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:15.525 12:12:28 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:16.091 12:12:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:31:16.355 [2024-07-15 12:12:29.788549] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:31:16.355 [2024-07-15 12:12:29.788600] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:16.355 [2024-07-15 12:12:29.788620] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2570b40 00:31:16.355 [2024-07-15 12:12:29.788632] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:16.355 [2024-07-15 12:12:29.788827] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:16.355 [2024-07-15 12:12:29.788843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:31:16.355 [2024-07-15 12:12:29.788896] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:31:16.355 [2024-07-15 12:12:29.788907] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:31:16.355 [2024-07-15 12:12:29.788917] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:31:16.355 [2024-07-15 12:12:29.788935] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:31:16.355 [2024-07-15 12:12:29.792373] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2571d40 00:31:16.355 [2024-07-15 12:12:29.793851] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:31:16.355 spare 00:31:16.355 12:12:29 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:31:17.293 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:31:17.293 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:17.293 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:31:17.293 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:31:17.294 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:17.294 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:17.294 12:12:30 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:17.552 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:17.552 "name": "raid_bdev1", 00:31:17.552 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 
00:31:17.552 "strip_size_kb": 0, 00:31:17.552 "state": "online", 00:31:17.552 "raid_level": "raid1", 00:31:17.552 "superblock": true, 00:31:17.552 "num_base_bdevs": 2, 00:31:17.552 "num_base_bdevs_discovered": 2, 00:31:17.552 "num_base_bdevs_operational": 2, 00:31:17.552 "process": { 00:31:17.552 "type": "rebuild", 00:31:17.552 "target": "spare", 00:31:17.552 "progress": { 00:31:17.552 "blocks": 3072, 00:31:17.552 "percent": 38 00:31:17.552 } 00:31:17.552 }, 00:31:17.552 "base_bdevs_list": [ 00:31:17.552 { 00:31:17.552 "name": "spare", 00:31:17.552 "uuid": "f9d0edcb-bc98-5b87-ae6b-1109659240ca", 00:31:17.552 "is_configured": true, 00:31:17.552 "data_offset": 256, 00:31:17.552 "data_size": 7936 00:31:17.552 }, 00:31:17.552 { 00:31:17.552 "name": "BaseBdev2", 00:31:17.552 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:17.552 "is_configured": true, 00:31:17.552 "data_offset": 256, 00:31:17.552 "data_size": 7936 00:31:17.552 } 00:31:17.552 ] 00:31:17.552 }' 00:31:17.552 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:17.552 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:31:17.552 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:17.811 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:31:17.811 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:31:17.811 [2024-07-15 12:12:31.314898] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:17.811 [2024-07-15 12:12:31.406425] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:31:17.811 [2024-07-15 
12:12:31.406479] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:31:17.811 [2024-07-15 12:12:31.406494] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:31:17.811 [2024-07-15 12:12:31.406502] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:18.071 12:12:31 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:18.071 "name": "raid_bdev1", 00:31:18.071 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:18.071 "strip_size_kb": 0, 00:31:18.071 "state": "online", 00:31:18.071 "raid_level": "raid1", 00:31:18.071 "superblock": true, 00:31:18.071 "num_base_bdevs": 2, 00:31:18.071 "num_base_bdevs_discovered": 1, 00:31:18.071 "num_base_bdevs_operational": 1, 00:31:18.071 "base_bdevs_list": [ 00:31:18.071 { 00:31:18.071 "name": null, 00:31:18.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:18.071 "is_configured": false, 00:31:18.071 "data_offset": 256, 00:31:18.071 "data_size": 7936 00:31:18.071 }, 00:31:18.071 { 00:31:18.071 "name": "BaseBdev2", 00:31:18.071 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:18.071 "is_configured": true, 00:31:18.071 "data_offset": 256, 00:31:18.071 "data_size": 7936 00:31:18.071 } 00:31:18.071 ] 00:31:18.071 }' 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:18.071 12:12:31 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:19.007 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:19.008 12:12:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:19.008 "name": "raid_bdev1", 00:31:19.008 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:19.008 "strip_size_kb": 0, 00:31:19.008 "state": "online", 00:31:19.008 "raid_level": "raid1", 00:31:19.008 "superblock": true, 00:31:19.008 "num_base_bdevs": 2, 00:31:19.008 "num_base_bdevs_discovered": 1, 00:31:19.008 "num_base_bdevs_operational": 1, 00:31:19.008 "base_bdevs_list": [ 00:31:19.008 { 00:31:19.008 "name": null, 00:31:19.008 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:19.008 "is_configured": false, 00:31:19.008 "data_offset": 256, 00:31:19.008 "data_size": 7936 00:31:19.008 }, 00:31:19.008 { 00:31:19.008 "name": "BaseBdev2", 00:31:19.008 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:19.008 "is_configured": true, 00:31:19.008 "data_offset": 256, 00:31:19.008 "data_size": 7936 00:31:19.008 } 00:31:19.008 ] 00:31:19.008 }' 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:19.008 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:31:19.267 12:12:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:31:19.526 [2024-07-15 12:12:33.034289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:31:19.526 [2024-07-15 12:12:33.034335] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:31:19.526 [2024-07-15 12:12:33.034355] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2579480 00:31:19.526 [2024-07-15 12:12:33.034367] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:31:19.526 [2024-07-15 12:12:33.034535] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:31:19.526 [2024-07-15 12:12:33.034551] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:31:19.526 [2024-07-15 12:12:33.034594] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:31:19.526 [2024-07-15 12:12:33.034605] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:19.526 [2024-07-15 12:12:33.034615] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:19.526 BaseBdev1 00:31:19.526 12:12:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:20.901 "name": "raid_bdev1", 00:31:20.901 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:20.901 "strip_size_kb": 0, 00:31:20.901 "state": "online", 00:31:20.901 "raid_level": "raid1", 00:31:20.901 "superblock": true, 00:31:20.901 "num_base_bdevs": 2, 00:31:20.901 "num_base_bdevs_discovered": 1, 00:31:20.901 "num_base_bdevs_operational": 1, 00:31:20.901 "base_bdevs_list": [ 00:31:20.901 { 00:31:20.901 "name": null, 00:31:20.901 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:20.901 "is_configured": false, 00:31:20.901 "data_offset": 256, 00:31:20.901 "data_size": 7936 00:31:20.901 }, 00:31:20.901 { 00:31:20.901 "name": "BaseBdev2", 00:31:20.901 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:20.901 "is_configured": true, 00:31:20.901 "data_offset": 256, 00:31:20.901 
"data_size": 7936 00:31:20.901 } 00:31:20.901 ] 00:31:20.901 }' 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:20.901 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:21.470 12:12:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:21.729 "name": "raid_bdev1", 00:31:21.729 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:21.729 "strip_size_kb": 0, 00:31:21.729 "state": "online", 00:31:21.729 "raid_level": "raid1", 00:31:21.729 "superblock": true, 00:31:21.729 "num_base_bdevs": 2, 00:31:21.729 "num_base_bdevs_discovered": 1, 00:31:21.729 "num_base_bdevs_operational": 1, 00:31:21.729 "base_bdevs_list": [ 00:31:21.729 { 00:31:21.729 "name": null, 00:31:21.729 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:21.729 "is_configured": false, 00:31:21.729 "data_offset": 256, 00:31:21.729 "data_size": 7936 00:31:21.729 }, 
00:31:21.729 { 00:31:21.729 "name": "BaseBdev2", 00:31:21.729 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:21.729 "is_configured": true, 00:31:21.729 "data_offset": 256, 00:31:21.729 "data_size": 7936 00:31:21.729 } 00:31:21.729 ] 00:31:21.729 }' 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@648 -- # local es=0 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@636 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 
00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@640 -- # case "$(type -t "$arg")" in 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:31:21.729 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:31:21.988 [2024-07-15 12:12:35.440705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:31:21.988 [2024-07-15 12:12:35.440828] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:31:21.988 [2024-07-15 12:12:35.440843] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:31:21.988 request: 00:31:21.988 { 00:31:21.988 "base_bdev": "BaseBdev1", 00:31:21.988 "raid_bdev": "raid_bdev1", 00:31:21.988 "method": "bdev_raid_add_base_bdev", 00:31:21.988 "req_id": 1 00:31:21.988 } 00:31:21.988 Got JSON-RPC error response 00:31:21.988 response: 00:31:21.988 { 00:31:21.988 "code": -22, 00:31:21.988 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:31:21.988 } 00:31:21.988 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # es=1 00:31:21.988 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@659 -- # (( es > 128 )) 
00:31:21.988 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@670 -- # [[ -n '' ]] 00:31:21.988 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@675 -- # (( !es == 0 )) 00:31:21.988 12:12:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:22.927 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:23.186 12:12:36 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:31:23.186 "name": "raid_bdev1", 00:31:23.186 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:23.186 "strip_size_kb": 0, 00:31:23.186 "state": "online", 00:31:23.186 "raid_level": "raid1", 00:31:23.186 "superblock": true, 00:31:23.186 "num_base_bdevs": 2, 00:31:23.186 "num_base_bdevs_discovered": 1, 00:31:23.186 "num_base_bdevs_operational": 1, 00:31:23.186 "base_bdevs_list": [ 00:31:23.186 { 00:31:23.186 "name": null, 00:31:23.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:23.186 "is_configured": false, 00:31:23.186 "data_offset": 256, 00:31:23.186 "data_size": 7936 00:31:23.186 }, 00:31:23.186 { 00:31:23.186 "name": "BaseBdev2", 00:31:23.186 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:23.186 "is_configured": true, 00:31:23.186 "data_offset": 256, 00:31:23.186 "data_size": 7936 00:31:23.186 } 00:31:23.186 ] 00:31:23.186 }' 00:31:23.186 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:31:23.186 12:12:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:31:23.754 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:31:23.754 12:12:37 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:31:24.013 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:31:24.013 "name": "raid_bdev1", 00:31:24.013 "uuid": "87fcdc8e-8f60-4298-ba4d-7086c0b5a98b", 00:31:24.013 "strip_size_kb": 0, 00:31:24.013 "state": "online", 00:31:24.013 "raid_level": "raid1", 00:31:24.013 "superblock": true, 00:31:24.013 "num_base_bdevs": 2, 00:31:24.013 "num_base_bdevs_discovered": 1, 00:31:24.013 "num_base_bdevs_operational": 1, 00:31:24.013 "base_bdevs_list": [ 00:31:24.013 { 00:31:24.013 "name": null, 00:31:24.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:31:24.013 "is_configured": false, 00:31:24.013 "data_offset": 256, 00:31:24.013 "data_size": 7936 00:31:24.013 }, 00:31:24.013 { 00:31:24.013 "name": "BaseBdev2", 00:31:24.013 "uuid": "62229a54-0b30-5c9c-bee8-b932fba7e0ff", 00:31:24.013 "is_configured": true, 00:31:24.013 "data_offset": 256, 00:31:24.013 "data_size": 7936 00:31:24.013 } 00:31:24.013 ] 00:31:24.013 }' 00:31:24.013 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:31:24.013 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:31:24.013 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 1618989 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@948 -- # '[' -z 1618989 ']' 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@952 -- # kill -0 1618989 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # uname 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1618989 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1618989' 00:31:24.273 killing process with pid 1618989 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # kill 1618989 00:31:24.273 Received shutdown signal, test time was about 60.000000 seconds 00:31:24.273 00:31:24.273 Latency(us) 00:31:24.273 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:24.273 =================================================================================================================== 00:31:24.273 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:31:24.273 [2024-07-15 12:12:37.717781] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:31:24.273 [2024-07-15 12:12:37.717866] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:31:24.273 [2024-07-15 12:12:37.717914] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:31:24.273 [2024-07-15 12:12:37.717927] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25779d0 name raid_bdev1, state offline 00:31:24.273 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@972 -- # wait 1618989 00:31:24.273 [2024-07-15 12:12:37.749532] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:31:24.533 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:31:24.533 00:31:24.533 real 0m33.505s 00:31:24.533 user 0m54.565s 00:31:24.533 sys 0m4.404s 00:31:24.533 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:24.533 12:12:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:31:24.533 ************************************ 00:31:24.533 END TEST raid_rebuild_test_sb_md_interleaved 00:31:24.533 ************************************ 00:31:24.533 12:12:38 bdev_raid -- common/autotest_common.sh@1142 -- # return 0 00:31:24.533 12:12:38 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:31:24.533 12:12:38 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:31:24.533 12:12:38 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1618989 ']' 00:31:24.533 12:12:38 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1618989 00:31:24.533 12:12:38 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:31:24.533 00:31:24.533 real 19m21.247s 00:31:24.533 user 32m55.000s 00:31:24.533 sys 3m31.525s 00:31:24.533 12:12:38 bdev_raid -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:24.533 12:12:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:31:24.533 ************************************ 00:31:24.533 END TEST bdev_raid 00:31:24.533 ************************************ 00:31:24.533 12:12:38 -- common/autotest_common.sh@1142 -- # return 0 00:31:24.533 12:12:38 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:24.533 12:12:38 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:24.533 12:12:38 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:24.533 12:12:38 -- 
common/autotest_common.sh@10 -- # set +x 00:31:24.793 ************************************ 00:31:24.793 START TEST bdevperf_config 00:31:24.793 ************************************ 00:31:24.793 12:12:38 bdevperf_config -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:31:24.793 * Looking for test storage... 00:31:24.793 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:24.793 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@20 -- # 
cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:24.793 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:24.793 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:24.793 00:31:24.793 12:12:38 bdevperf_config -- 
bdevperf/common.sh@20 -- # cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:24.793 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:24.793 12:12:38 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-15 12:12:38.341964] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:28.086 [2024-07-15 12:12:38.342031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623946 ] 00:31:28.086 Using job config with 4 jobs 00:31:28.086 [2024-07-15 12:12:38.492285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:28.086 [2024-07-15 12:12:38.608479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:28.086 cpumask for '\''job0'\'' is too big 00:31:28.086 cpumask for '\''job1'\'' is too big 00:31:28.086 cpumask for '\''job2'\'' is too big 00:31:28.086 cpumask for '\''job3'\'' is too big 00:31:28.086 Running I/O for 2 seconds... 00:31:28.086 00:31:28.086 Latency(us) 00:31:28.086 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23998.17 23.44 0.00 0.00 10651.42 1866.35 16412.49 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23976.18 23.41 0.00 0.00 10637.45 1837.86 14474.91 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23954.34 23.39 0.00 0.00 10623.21 1852.10 13050.21 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 24026.90 23.46 0.00 0.00 10568.22 933.18 12993.22 00:31:28.086 =================================================================================================================== 00:31:28.086 Total : 95955.58 93.71 0.00 0.00 10620.01 933.18 16412.49' 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-15 12:12:38.341964] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:28.086 [2024-07-15 12:12:38.342031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623946 ] 00:31:28.086 Using job config with 4 jobs 00:31:28.086 [2024-07-15 12:12:38.492285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:28.086 [2024-07-15 12:12:38.608479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:28.086 cpumask for '\''job0'\'' is too big 00:31:28.086 cpumask for '\''job1'\'' is too big 00:31:28.086 cpumask for '\''job2'\'' is too big 00:31:28.086 cpumask for '\''job3'\'' is too big 00:31:28.086 Running I/O for 2 seconds... 00:31:28.086 00:31:28.086 Latency(us) 00:31:28.086 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23998.17 23.44 0.00 0.00 10651.42 1866.35 16412.49 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23976.18 23.41 0.00 0.00 10637.45 1837.86 14474.91 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23954.34 23.39 0.00 0.00 10623.21 1852.10 13050.21 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 24026.90 23.46 0.00 0.00 10568.22 933.18 12993.22 00:31:28.086 =================================================================================================================== 00:31:28.086 Total : 95955.58 93.71 0.00 0.00 10620.01 933.18 16412.49' 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 12:12:38.341964] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:28.086 [2024-07-15 12:12:38.342031] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1623946 ] 00:31:28.086 Using job config with 4 jobs 00:31:28.086 [2024-07-15 12:12:38.492285] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:28.086 [2024-07-15 12:12:38.608479] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:28.086 cpumask for '\''job0'\'' is too big 00:31:28.086 cpumask for '\''job1'\'' is too big 00:31:28.086 cpumask for '\''job2'\'' is too big 00:31:28.086 cpumask for '\''job3'\'' is too big 00:31:28.086 Running I/O for 2 seconds... 00:31:28.086 00:31:28.086 Latency(us) 00:31:28.086 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23998.17 23.44 0.00 0.00 10651.42 1866.35 16412.49 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23976.18 23.41 0.00 0.00 10637.45 1837.86 14474.91 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 23954.34 23.39 0.00 0.00 10623.21 1852.10 13050.21 00:31:28.086 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:28.086 Malloc0 : 2.02 24026.90 23.46 0.00 0.00 10568.22 933.18 12993.22 00:31:28.086 =================================================================================================================== 00:31:28.086 Total : 95955.58 93.71 0.00 0.00 10620.01 933.18 16412.49' 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:28.086 12:12:41 
bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:31:28.086 12:12:41 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:28.086 [2024-07-15 12:12:41.113316] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:31:28.086 [2024-07-15 12:12:41.113381] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624305 ] 00:31:28.086 [2024-07-15 12:12:41.254725] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:28.086 [2024-07-15 12:12:41.376028] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:28.086 cpumask for 'job0' is too big 00:31:28.086 cpumask for 'job1' is too big 00:31:28.086 cpumask for 'job2' is too big 00:31:28.086 cpumask for 'job3' is too big 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:31:30.621 Running I/O for 2 seconds... 
00:31:30.621 00:31:30.621 Latency(us) 00:31:30.621 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:30.621 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:30.621 Malloc0 : 2.02 24136.32 23.57 0.00 0.00 10593.53 1852.10 16298.52 00:31:30.621 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:30.621 Malloc0 : 2.02 24114.26 23.55 0.00 0.00 10579.62 1837.86 14360.93 00:31:30.621 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:30.621 Malloc0 : 2.02 24155.29 23.59 0.00 0.00 10537.84 1837.86 12537.32 00:31:30.621 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:31:30.621 Malloc0 : 2.03 24133.36 23.57 0.00 0.00 10524.08 1837.86 10941.66 00:31:30.621 =================================================================================================================== 00:31:30.621 Total : 96539.23 94.28 0.00 0.00 10558.69 1837.86 16298.52' 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:30.621 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:30.621 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:30.621 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:30.621 12:12:43 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:33.153 12:12:46 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-15 12:12:43.898443] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:33.153 [2024-07-15 12:12:43.898513] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624658 ] 00:31:33.153 Using job config with 3 jobs 00:31:33.153 [2024-07-15 12:12:44.039405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.153 [2024-07-15 12:12:44.159453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.153 cpumask for '\''job0'\'' is too big 00:31:33.153 cpumask for '\''job1'\'' is too big 00:31:33.153 cpumask for '\''job2'\'' is too big 00:31:33.153 Running I/O for 2 seconds... 00:31:33.153 00:31:33.153 Latency(us) 00:31:33.153 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:33.153 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.153 Malloc0 : 2.01 32624.61 31.86 0.00 0.00 7832.30 1802.24 11511.54 00:31:33.153 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.153 Malloc0 : 2.02 32636.49 31.87 0.00 0.00 7811.53 1773.75 9687.93 00:31:33.153 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.153 Malloc0 : 2.02 32606.73 31.84 0.00 0.00 7801.70 1780.87 8206.25 00:31:33.153 =================================================================================================================== 00:31:33.153 Total : 97867.84 95.57 0.00 0.00 7815.15 1773.75 11511.54' 00:31:33.153 12:12:46 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-15 12:12:43.898443] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:33.153 [2024-07-15 12:12:43.898513] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624658 ] 00:31:33.153 Using job config with 3 jobs 00:31:33.153 [2024-07-15 12:12:44.039405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.153 [2024-07-15 12:12:44.159453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.153 cpumask for '\''job0'\'' is too big 00:31:33.153 cpumask for '\''job1'\'' is too big 00:31:33.154 cpumask for '\''job2'\'' is too big 00:31:33.154 Running I/O for 2 seconds... 00:31:33.154 00:31:33.154 Latency(us) 00:31:33.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.01 32624.61 31.86 0.00 0.00 7832.30 1802.24 11511.54 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.02 32636.49 31.87 0.00 0.00 7811.53 1773.75 9687.93 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.02 32606.73 31.84 0.00 0.00 7801.70 1780.87 8206.25 00:31:33.154 =================================================================================================================== 00:31:33.154 Total : 97867.84 95.57 0.00 0.00 7815.15 1773.75 11511.54' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 12:12:43.898443] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:33.154 [2024-07-15 12:12:43.898513] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1624658 ] 00:31:33.154 Using job config with 3 jobs 00:31:33.154 [2024-07-15 12:12:44.039405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.154 [2024-07-15 12:12:44.159453] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.154 cpumask for '\''job0'\'' is too big 00:31:33.154 cpumask for '\''job1'\'' is too big 00:31:33.154 cpumask for '\''job2'\'' is too big 00:31:33.154 Running I/O for 2 seconds... 00:31:33.154 00:31:33.154 Latency(us) 00:31:33.154 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.01 32624.61 31.86 0.00 0.00 7832.30 1802.24 11511.54 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.02 32636.49 31.87 0.00 0.00 7811.53 1773.75 9687.93 00:31:33.154 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:31:33.154 Malloc0 : 2.02 32606.73 31.84 0.00 0.00 7801.70 1780.87 8206.25 00:31:33.154 =================================================================================================================== 00:31:33.154 Total : 97867.84 95.57 0.00 0.00 7815.15 1773.75 11511.54' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:33.154 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:33.154 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:31:33.154 12:12:46 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:33.154 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:33.154 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:31:33.154 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:31:33.154 12:12:46 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:36.441 12:12:49 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-15 12:12:46.667939] 
Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:31:36.441 [2024-07-15 12:12:46.668008] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625014 ] 00:31:36.441 Using job config with 4 jobs 00:31:36.441 [2024-07-15 12:12:46.810033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:36.441 [2024-07-15 12:12:46.923128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:36.441 cpumask for '\''job0'\'' is too big 00:31:36.441 cpumask for '\''job1'\'' is too big 00:31:36.441 cpumask for '\''job2'\'' is too big 00:31:36.441 cpumask for '\''job3'\'' is too big 00:31:36.441 Running I/O for 2 seconds... 00:31:36.441 00:31:36.441 Latency(us) 00:31:36.441 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.04 12060.98 11.78 0.00 0.00 21204.90 3789.69 32824.99 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.04 12049.78 11.77 0.00 0.00 21204.91 4587.52 32824.99 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.04 12038.94 11.76 0.00 0.00 21148.18 3732.70 28949.82 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.04 12027.82 11.75 0.00 0.00 21146.82 4587.52 28949.82 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.05 12017.01 11.74 0.00 0.00 21091.36 3732.70 25188.62 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 
Malloc1 : 2.05 12005.94 11.72 0.00 0.00 21091.09 4559.03 25302.59 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.05 11995.16 11.71 0.00 0.00 21035.23 3732.70 21655.37 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.05 11984.17 11.70 0.00 0.00 21034.99 4587.52 21655.37 00:31:36.441 =================================================================================================================== 00:31:36.441 Total : 96179.79 93.93 0.00 0.00 21119.68 3732.70 32824.99' 00:31:36.441 12:12:49 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-15 12:12:46.667939] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:31:36.441 [2024-07-15 12:12:46.668008] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625014 ] 00:31:36.441 Using job config with 4 jobs 00:31:36.441 [2024-07-15 12:12:46.810033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:36.441 [2024-07-15 12:12:46.923128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:36.441 cpumask for '\''job0'\'' is too big 00:31:36.441 cpumask for '\''job1'\'' is too big 00:31:36.441 cpumask for '\''job2'\'' is too big 00:31:36.441 cpumask for '\''job3'\'' is too big 00:31:36.441 Running I/O for 2 seconds... 
00:31:36.441 00:31:36.441 Latency(us) 00:31:36.441 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.04 12060.98 11.78 0.00 0.00 21204.90 3789.69 32824.99 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.04 12049.78 11.77 0.00 0.00 21204.91 4587.52 32824.99 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.04 12038.94 11.76 0.00 0.00 21148.18 3732.70 28949.82 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.04 12027.82 11.75 0.00 0.00 21146.82 4587.52 28949.82 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.05 12017.01 11.74 0.00 0.00 21091.36 3732.70 25188.62 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.05 12005.94 11.72 0.00 0.00 21091.09 4559.03 25302.59 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.05 11995.16 11.71 0.00 0.00 21035.23 3732.70 21655.37 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.05 11984.17 11.70 0.00 0.00 21034.99 4587.52 21655.37 00:31:36.441 =================================================================================================================== 00:31:36.441 Total : 96179.79 93.93 0.00 0.00 21119.68 3732.70 32824.99' 00:31:36.441 12:12:49 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-15 12:12:46.667939] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:36.441 [2024-07-15 12:12:46.668008] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625014 ] 00:31:36.441 Using job config with 4 jobs 00:31:36.441 [2024-07-15 12:12:46.810033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:36.441 [2024-07-15 12:12:46.923128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:36.441 cpumask for '\''job0'\'' is too big 00:31:36.441 cpumask for '\''job1'\'' is too big 00:31:36.441 cpumask for '\''job2'\'' is too big 00:31:36.441 cpumask for '\''job3'\'' is too big 00:31:36.441 Running I/O for 2 seconds... 00:31:36.441 00:31:36.441 Latency(us) 00:31:36.441 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:36.441 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc0 : 2.04 12060.98 11.78 0.00 0.00 21204.90 3789.69 32824.99 00:31:36.441 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.441 Malloc1 : 2.04 12049.78 11.77 0.00 0.00 21204.91 4587.52 32824.99 00:31:36.442 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc0 : 2.04 12038.94 11.76 0.00 0.00 21148.18 3732.70 28949.82 00:31:36.442 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc1 : 2.04 12027.82 11.75 0.00 0.00 21146.82 4587.52 28949.82 00:31:36.442 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc0 : 2.05 12017.01 11.74 0.00 0.00 21091.36 3732.70 25188.62 00:31:36.442 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc1 : 2.05 12005.94 11.72 0.00 0.00 21091.09 4559.03 25302.59 00:31:36.442 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc0 : 2.05 11995.16 11.71 0.00 0.00 21035.23 3732.70 21655.37 00:31:36.442 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:31:36.442 Malloc1 : 2.05 11984.17 11.70 0.00 0.00 21034.99 4587.52 21655.37 00:31:36.442 =================================================================================================================== 00:31:36.442 Total : 96179.79 93.93 0.00 0.00 21119.68 3732.70 32824.99' 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:31:36.442 12:12:49 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:31:36.442 00:31:36.442 real 0m11.264s 00:31:36.442 user 0m9.929s 00:31:36.442 sys 0m1.180s 00:31:36.442 12:12:49 bdevperf_config -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:36.442 12:12:49 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:31:36.442 ************************************ 00:31:36.442 END TEST bdevperf_config 00:31:36.442 ************************************ 00:31:36.442 12:12:49 -- common/autotest_common.sh@1142 -- # return 0 00:31:36.442 12:12:49 -- spdk/autotest.sh@192 -- # uname -s 00:31:36.442 12:12:49 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:31:36.442 12:12:49 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:36.442 12:12:49 -- common/autotest_common.sh@1099 
-- # '[' 2 -le 1 ']' 00:31:36.442 12:12:49 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:36.442 12:12:49 -- common/autotest_common.sh@10 -- # set +x 00:31:36.442 ************************************ 00:31:36.442 START TEST reactor_set_interrupt 00:31:36.442 ************************************ 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:36.442 * Looking for test storage... 00:31:36.442 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:36.442 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:36.442 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:31:36.442 12:12:49 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:36.442 12:12:49 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:36.442 12:12:49 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:36.442 12:12:49 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:36.443 12:12:49 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:36.443 12:12:49 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:36.443 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:31:36.443 #define SPDK_CONFIG_H 00:31:36.443 #define SPDK_CONFIG_APPS 1 00:31:36.443 #define SPDK_CONFIG_ARCH native 00:31:36.443 #undef SPDK_CONFIG_ASAN 00:31:36.443 #undef SPDK_CONFIG_AVAHI 00:31:36.443 #undef SPDK_CONFIG_CET 00:31:36.443 #define SPDK_CONFIG_COVERAGE 1 00:31:36.443 #define SPDK_CONFIG_CROSS_PREFIX 
00:31:36.443 #define SPDK_CONFIG_CRYPTO 1 00:31:36.443 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:36.443 #undef SPDK_CONFIG_CUSTOMOCF 00:31:36.443 #undef SPDK_CONFIG_DAOS 00:31:36.443 #define SPDK_CONFIG_DAOS_DIR 00:31:36.443 #define SPDK_CONFIG_DEBUG 1 00:31:36.443 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:36.443 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:36.443 #define SPDK_CONFIG_DPDK_INC_DIR 00:31:36.443 #define SPDK_CONFIG_DPDK_LIB_DIR 00:31:36.443 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:36.443 #undef SPDK_CONFIG_DPDK_UADK 00:31:36.443 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:36.443 #define SPDK_CONFIG_EXAMPLES 1 00:31:36.443 #undef SPDK_CONFIG_FC 00:31:36.443 #define SPDK_CONFIG_FC_PATH 00:31:36.443 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:36.443 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:36.443 #undef SPDK_CONFIG_FUSE 00:31:36.443 #undef SPDK_CONFIG_FUZZER 00:31:36.443 #define SPDK_CONFIG_FUZZER_LIB 00:31:36.443 #undef SPDK_CONFIG_GOLANG 00:31:36.443 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:36.443 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:36.443 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:36.443 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:36.443 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:36.443 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:36.443 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:36.443 #define SPDK_CONFIG_IDXD 1 00:31:36.443 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:36.443 #define SPDK_CONFIG_IPSEC_MB 1 00:31:36.443 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:36.443 #define SPDK_CONFIG_ISAL 1 00:31:36.443 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:36.443 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:36.443 #define SPDK_CONFIG_LIBDIR 00:31:36.443 #undef SPDK_CONFIG_LTO 00:31:36.443 #define SPDK_CONFIG_MAX_LCORES 128 00:31:36.443 #define SPDK_CONFIG_NVME_CUSE 1 00:31:36.443 #undef 
SPDK_CONFIG_OCF 00:31:36.443 #define SPDK_CONFIG_OCF_PATH 00:31:36.443 #define SPDK_CONFIG_OPENSSL_PATH 00:31:36.443 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:36.443 #define SPDK_CONFIG_PGO_DIR 00:31:36.443 #undef SPDK_CONFIG_PGO_USE 00:31:36.443 #define SPDK_CONFIG_PREFIX /usr/local 00:31:36.443 #undef SPDK_CONFIG_RAID5F 00:31:36.443 #undef SPDK_CONFIG_RBD 00:31:36.443 #define SPDK_CONFIG_RDMA 1 00:31:36.443 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:36.443 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:36.443 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:36.443 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:36.443 #define SPDK_CONFIG_SHARED 1 00:31:36.443 #undef SPDK_CONFIG_SMA 00:31:36.443 #define SPDK_CONFIG_TESTS 1 00:31:36.443 #undef SPDK_CONFIG_TSAN 00:31:36.443 #define SPDK_CONFIG_UBLK 1 00:31:36.443 #define SPDK_CONFIG_UBSAN 1 00:31:36.443 #undef SPDK_CONFIG_UNIT_TESTS 00:31:36.443 #undef SPDK_CONFIG_URING 00:31:36.443 #define SPDK_CONFIG_URING_PATH 00:31:36.443 #undef SPDK_CONFIG_URING_ZNS 00:31:36.443 #undef SPDK_CONFIG_USDT 00:31:36.443 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:36.443 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:36.443 #undef SPDK_CONFIG_VFIO_USER 00:31:36.443 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:36.443 #define SPDK_CONFIG_VHOST 1 00:31:36.443 #define SPDK_CONFIG_VIRTIO 1 00:31:36.443 #undef SPDK_CONFIG_VTUNE 00:31:36.443 #define SPDK_CONFIG_VTUNE_DIR 00:31:36.443 #define SPDK_CONFIG_WERROR 1 00:31:36.443 #define SPDK_CONFIG_WPDK_DIR 00:31:36.443 #undef SPDK_CONFIG_XNVME 00:31:36.443 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:36.443 12:12:49 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:36.443 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:36.443 12:12:49 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:31:36.443 12:12:49 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:36.443 12:12:49 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:36.443 12:12:49 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:36.443 12:12:49 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:36.443 12:12:49 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:36.443 12:12:49 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:31:36.444 12:12:49 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:36.444 12:12:49 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:36.444 12:12:49 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@124 -- # :
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0
00:31:36.444 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@138 -- # :
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@154 -- # :
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@167 -- # :
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']'
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']'
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind=
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind=
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']'
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]]
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make
00:31:36.445 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=()
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@299 -- # TEST_MODE=
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 1625406 ]]
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 1625406
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]]
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.VM1KcM
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback")
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]]
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]]
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.VM1KcM/tests/interrupt /tmp/spdk.VM1KcM
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0
00:31:36.446 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=955969536
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4328460288
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=84582699008
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508548096
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9925849088
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47249563648
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=18892337152
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901712896
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9375744
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47253651456
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=622592
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n'
00:31:36.447 * Looking for test storage...
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}"
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}'
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=84582699008
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size ))
00:31:36.447 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size ))
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]]
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]]
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]]
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=12140441600
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 ))
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:31:36.448 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ '
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]]
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]]
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]'
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 ))
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1625478
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1625478 /var/tmp/spdk.sock
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1625478 ']'
00:31:36.448 12:12:49 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:31:36.448 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable
00:31:36.448 12:12:49 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:31:36.448 [2024-07-15 12:12:49.841277] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:31:36.448 [2024-07-15 12:12:49.841351] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1625478 ]
00:31:36.448 [2024-07-15 12:12:49.972646] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:31:36.707 [2024-07-15 12:12:50.085821] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:31:36.708 [2024-07-15 12:12:50.085922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:31:36.708 [2024-07-15 12:12:50.085922] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:36.708 [2024-07-15 12:12:50.159381] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:31:37.275 12:12:50 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:31:37.275 12:12:50 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0
00:31:37.275 12:12:50 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem
00:31:37.275 12:12:50 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:31:37.534 Malloc0
00:31:37.534 Malloc1
00:31:37.534 Malloc2
00:31:37.534 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio
00:31:37.534 12:12:51 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s
00:31:37.535 12:12:51 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]]
00:31:37.535 12:12:51 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000
00:31:37.794 5000+0 records in
00:31:37.794 5000+0 records out
00:31:37.794 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0275948 s, 371 MB/s
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:31:37.794 AIO0
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1625478
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1625478 without_thd
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1625478
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask))
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:31:37.794 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:31:38.055 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:38.314 12:12:51 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:31:38.314 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:31:38.314 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
00:31:38.314 spdk_thread ids are 1 on reactor0.
00:31:38.314 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:31:38.314 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1625478 0
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1625478 0 idle
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:38.315 12:12:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625478 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0'
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625478 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.41 reactor_0
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1625478 1
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1625478 1 idle
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:38.631 12:12:51 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625527 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1'
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625527 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2}
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1625478 2
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1625478 2 idle
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:38.631 12:12:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625528 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2'
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625528 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']'
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}"
00:31:38.890 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2
00:31:39.150 [2024-07-15 12:12:52.578887] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:31:39.150 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d
00:31:39.409 [2024-07-15 12:12:52.826535] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0.
00:31:39.409 [2024-07-15 12:12:52.826836] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:31:39.409 12:12:52 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d
00:31:39.668 [2024-07-15 12:12:53.070536] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2.
00:31:39.668 [2024-07-15 12:12:53.070733] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1625478 0
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1625478 0 busy
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625478 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0'
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625478 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.84 reactor_0
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]]
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1625478 2
00:31:39.669 12:12:53 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1625478 2 busy
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]]
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625528 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.35 reactor_2'
00:31:39.927 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625528 root 20 0 128.2g 36864 23616 R 93.8 0.0 0:00.35 reactor_2
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=93.8
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=93
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]]
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 93 -lt 70 ]]
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]]
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:39.928 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2
00:31:40.187 [2024-07-15 12:12:53.602518] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2.
00:31:40.187 [2024-07-15 12:12:53.602645] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']'
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1625478 2
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1625478 2 idle
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:40.187 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625528 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.52 reactor_2'
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625528 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.52 reactor_2
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:40.446 12:12:53 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0
00:31:40.705 [2024-07-15 12:12:54.046517] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0.
00:31:40.705 [2024-07-15 12:12:54.046699] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']'
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}"
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1
00:31:40.705 [2024-07-15 12:12:54.238819] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1625478 0
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1625478 0 idle
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1625478
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]]
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]]
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 ))
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 ))
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1625478 -w 256
00:31:40.705 12:12:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1625478 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.64 reactor_0'
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1625478 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.64 reactor_0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g'
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}'
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]]
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]]
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]]
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT
00:31:40.965 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1625478
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1625478 ']'
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 1625478
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']'
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1625478
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']'
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1625478'
00:31:40.965 killing process with pid 1625478
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1625478
00:31:40.965 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1625478
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1626221
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g
00:31:41.224 12:12:54 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1626221 /var/tmp/spdk.sock
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@829 -- # '[' -z 1626221 ']'
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local max_retries=100
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:31:41.224 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@838 -- # xtrace_disable
00:31:41.224 12:12:54 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x
00:31:41.224 [2024-07-15 12:12:54.814150] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:31:41.224 [2024-07-15 12:12:54.814203] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626221 ]
00:31:41.483 [2024-07-15 12:12:54.926868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3
00:31:41.483 [2024-07-15 12:12:55.028431] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:31:41.483 [2024-07-15 12:12:55.028537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:41.483 [2024-07-15 12:12:55.028537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:31:41.742 [2024-07-15 12:12:55.099490] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode.
00:31:42.310 12:12:55 reactor_set_interrupt -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:31:42.310 12:12:55 reactor_set_interrupt -- common/autotest_common.sh@862 -- # return 0
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:31:42.310 Malloc0
00:31:42.310 Malloc1
00:31:42.310 Malloc2
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]]
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000
00:31:42.310 5000+0 records in
00:31:42.310 5000+0 records out
00:31:42.310 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0271866 s, 377 MB/s
00:31:42.310 12:12:55 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048
00:31:42.569 AIO0
00:31:42.569 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1626221
00:31:42.569 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1626221
00:31:42.569 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1626221
00:31:42.569 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=
00:31:42.569 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask))
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask))
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats
00:31:42.829 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id'
00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo ''
00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]]
00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.'
00:31:43.088 spdk_thread ids are 1 on reactor0.
00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1626221 0 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1626221 0 idle 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:43.088 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:43.089 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:43.089 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0' 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.38 reactor_0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1626221 1 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1626221 1 idle 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626224 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1' 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626224 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_1 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:43.348 12:12:56 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1626221 2 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1626221 2 idle 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:43.348 12:12:56 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 
reactor_2' 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.00 reactor_2 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:31:43.607 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:31:43.866 [2024-07-15 12:12:57.321093] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:31:43.866 [2024-07-15 12:12:57.321251] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 
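The idle/busy probes traced above reduce to: sample `top -bHn 1` for the reactor thread, grep the reactor's line, take field 9 (%CPU), and compare it against fixed thresholds (busy requires >= 70%, idle requires <= 30%, matching the `[[ 99 -lt 70 ]]` and `[[ 0 -gt 30 ]]` checks in the trace). Below is a minimal offline sketch of that decision; the `reactor_state_matches` name and the pre-captured-line input are illustrative conveniences, not SPDK's actual `reactor_is_busy_or_idle` from interrupt/common.sh, which samples a live PID.

```shell
#!/usr/bin/env bash
# Sketch of the busy/idle decision seen in interrupt/common.sh. Takes a
# captured top(1) thread line as input so it can run without a live reactor.
reactor_state_matches() {
  local top_line=$1 state=$2
  local cpu_rate
  # Field 9 of `top -bH` output is %CPU (e.g. "99.9"); strip the fraction,
  # mirroring the trace's cpu_rate=99.9 -> cpu_rate=99 step.
  cpu_rate=$(echo "$top_line" | sed -e 's/^\s*//g' | awk '{print $9}')
  cpu_rate=${cpu_rate%%.*}
  if [[ $state = busy ]]; then
    [[ $cpu_rate -ge 70 ]]   # busy: reactor thread pinned near 100% CPU
  else
    [[ $cpu_rate -le 30 ]]   # idle: reactor thread barely running
  fi
}

# Lines taken verbatim from the trace above.
reactor_state_matches '1626221 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' busy
reactor_state_matches '1626225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2' idle
```

Note the gap between the thresholds: a reactor at, say, 50% CPU counts as neither busy nor idle, which is why the real test retries rather than asserting immediately.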
00:31:43.866 [2024-07-15 12:12:57.321443] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:43.866 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:31:44.126 [2024-07-15 12:12:57.573627] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:31:44.126 [2024-07-15 12:12:57.573844] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1626221 0 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1626221 0 busy 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:44.126 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626221 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0' 00:31:44.385 12:12:57 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1626221 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.82 reactor_0 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1626221 2 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1626221 2 busy 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:44.385 
12:12:57 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626225 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2' 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626225 root 20 0 128.2g 36864 23616 R 99.9 0.0 0:00.35 reactor_2 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:44.385 12:12:57 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:31:44.645 [2024-07-15 12:12:58.175352] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
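Each of these probes runs inside a bounded retry loop, visible in the trace as `(( j = 10 ))` ... `(( j != 0 ))`: sample up to ten times and succeed on the first sample that satisfies the requested state, so a reactor that has not quite finished switching modes does not fail the test spuriously. A generic sketch of that pattern, polling an arbitrary predicate command instead of top (the `poll_until` and `third_try` names are illustrative, not SPDK helpers):

```shell
#!/usr/bin/env bash
# Bounded-retry polling, in the shape of interrupt/common.sh's "(( j = 10 ))"
# loop: run a predicate command up to $1 times, succeed on the first pass.
poll_until() {
  local budget=$1; shift
  local j
  for (( j = budget; j != 0; j-- )); do
    if "$@"; then
      return 0
    fi
    sleep 0.1   # brief pause between samples
  done
  return 1      # budget exhausted without the predicate passing
}

# Demo predicate: passes only from its 3rd invocation onward.
tries=0
third_try() {
  tries=$(( tries + 1 ))
  [[ $tries -ge 3 ]]
}

poll_until 10 third_try && echo converged
```

The same loop serves both directions: waiting for a reactor to report busy right after interrupt mode is disabled, and for it to fall idle again after it is re-enabled.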
00:31:44.645 [2024-07-15 12:12:58.175475] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1626221 2 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1626221 2 idle 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:44.645 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2' 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626225 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:00.59 reactor_2 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:44.905 12:12:58 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:44.905 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:31:45.165 [2024-07-15 12:12:58.608467] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:31:45.165 [2024-07-15 12:12:58.608701] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 00:31:45.165 [2024-07-15 12:12:58.608728] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1626221 0 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1626221 0 idle 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1626221 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:31:45.165 12:12:58 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1626221 -w 256 00:31:45.165 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1626221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.67 reactor_0' 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1626221 root 20 0 128.2g 36864 23616 S 0.0 0.0 0:01.67 reactor_0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:31:45.425 12:12:58 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1626221 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@948 -- # '[' -z 1626221 ']' 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@952 -- # kill -0 
1626221 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@953 -- # uname 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1626221 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1626221' 00:31:45.425 killing process with pid 1626221 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@967 -- # kill 1626221 00:31:45.425 12:12:58 reactor_set_interrupt -- common/autotest_common.sh@972 -- # wait 1626221 00:31:45.685 12:12:59 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:31:45.685 12:12:59 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:45.685 00:31:45.685 real 0m9.645s 00:31:45.685 user 0m8.989s 00:31:45.685 sys 0m2.114s 00:31:45.685 12:12:59 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:45.685 12:12:59 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:31:45.685 ************************************ 00:31:45.685 END TEST reactor_set_interrupt 00:31:45.685 ************************************ 00:31:45.685 12:12:59 -- common/autotest_common.sh@1142 -- # return 0 00:31:45.685 12:12:59 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:45.685 12:12:59 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:31:45.685 12:12:59 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:45.685 12:12:59 -- 
common/autotest_common.sh@10 -- # set +x 00:31:45.685 ************************************ 00:31:45.685 START TEST reap_unregistered_poller 00:31:45.685 ************************************ 00:31:45.685 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:45.946 * Looking for test storage... 00:31:45.946 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
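The preamble being traced here resolves the test's own directory and the repository root from the script path (`dirname` plus `readlink -f`, then `../..`), so the suite behaves the same regardless of the caller's working directory. A standalone sketch of that idiom, exercised on a scratch directory tree instead of the real SPDK checkout (the tree layout and script name below are made up for the demo):

```shell
#!/usr/bin/env bash
# The testdir/rootdir resolution idiom from interrupt_common.sh, demonstrated
# on a throwaway tree. All paths here are illustrative, not the real repo.
root=$(mktemp -d)
mkdir -p "$root/test/interrupt"
script="$root/test/interrupt/example.sh"
: > "$script"

# Same shape as the traced commands:
#   testdir=$(readlink -f "$(dirname "$0")")
#   rootdir=$(readlink -f "$testdir/../..")
testdir=$(readlink -f "$(dirname "$script")")
rootdir=$(readlink -f "$testdir/../..")

echo "testdir=$testdir"
echo "rootdir=$rootdir"
```

`readlink -f` canonicalizes symlinks and relative components, which is why `rootdir` comes out as a clean absolute path even when the script is invoked through a symlinked workspace.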
00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:45.946 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:31:45.946 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@7 -- # 
CONFIG_PREFIX=/usr/local 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:31:45.946 
12:12:59 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:31:45.946 12:12:59 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:31:45.946 12:12:59 reap_unregistered_poller -- 
common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@65 
-- # CONFIG_APPS=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:31:45.947 12:12:59 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:31:45.947 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:45.947 12:12:59 
reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:31:45.947 12:12:59 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef 
SPDK_CONFIG_H 00:31:45.947 #define SPDK_CONFIG_H 00:31:45.947 #define SPDK_CONFIG_APPS 1 00:31:45.947 #define SPDK_CONFIG_ARCH native 00:31:45.947 #undef SPDK_CONFIG_ASAN 00:31:45.947 #undef SPDK_CONFIG_AVAHI 00:31:45.947 #undef SPDK_CONFIG_CET 00:31:45.947 #define SPDK_CONFIG_COVERAGE 1 00:31:45.947 #define SPDK_CONFIG_CROSS_PREFIX 00:31:45.947 #define SPDK_CONFIG_CRYPTO 1 00:31:45.947 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:31:45.947 #undef SPDK_CONFIG_CUSTOMOCF 00:31:45.947 #undef SPDK_CONFIG_DAOS 00:31:45.947 #define SPDK_CONFIG_DAOS_DIR 00:31:45.947 #define SPDK_CONFIG_DEBUG 1 00:31:45.947 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:31:45.947 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:31:45.947 #define SPDK_CONFIG_DPDK_INC_DIR 00:31:45.947 #define SPDK_CONFIG_DPDK_LIB_DIR 00:31:45.947 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:31:45.947 #undef SPDK_CONFIG_DPDK_UADK 00:31:45.947 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:31:45.947 #define SPDK_CONFIG_EXAMPLES 1 00:31:45.947 #undef SPDK_CONFIG_FC 00:31:45.947 #define SPDK_CONFIG_FC_PATH 00:31:45.947 #define SPDK_CONFIG_FIO_PLUGIN 1 00:31:45.947 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:31:45.947 #undef SPDK_CONFIG_FUSE 00:31:45.947 #undef SPDK_CONFIG_FUZZER 00:31:45.947 #define SPDK_CONFIG_FUZZER_LIB 00:31:45.947 #undef SPDK_CONFIG_GOLANG 00:31:45.947 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:31:45.947 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:31:45.947 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:31:45.947 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:31:45.947 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:31:45.947 #undef SPDK_CONFIG_HAVE_LIBBSD 00:31:45.947 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:31:45.947 #define SPDK_CONFIG_IDXD 1 00:31:45.947 #define SPDK_CONFIG_IDXD_KERNEL 1 00:31:45.947 #define SPDK_CONFIG_IPSEC_MB 1 00:31:45.947 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 
00:31:45.947 #define SPDK_CONFIG_ISAL 1 00:31:45.947 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:31:45.947 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:31:45.947 #define SPDK_CONFIG_LIBDIR 00:31:45.947 #undef SPDK_CONFIG_LTO 00:31:45.947 #define SPDK_CONFIG_MAX_LCORES 128 00:31:45.947 #define SPDK_CONFIG_NVME_CUSE 1 00:31:45.947 #undef SPDK_CONFIG_OCF 00:31:45.947 #define SPDK_CONFIG_OCF_PATH 00:31:45.947 #define SPDK_CONFIG_OPENSSL_PATH 00:31:45.947 #undef SPDK_CONFIG_PGO_CAPTURE 00:31:45.947 #define SPDK_CONFIG_PGO_DIR 00:31:45.947 #undef SPDK_CONFIG_PGO_USE 00:31:45.947 #define SPDK_CONFIG_PREFIX /usr/local 00:31:45.947 #undef SPDK_CONFIG_RAID5F 00:31:45.947 #undef SPDK_CONFIG_RBD 00:31:45.947 #define SPDK_CONFIG_RDMA 1 00:31:45.947 #define SPDK_CONFIG_RDMA_PROV verbs 00:31:45.947 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:31:45.947 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:31:45.947 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:31:45.947 #define SPDK_CONFIG_SHARED 1 00:31:45.947 #undef SPDK_CONFIG_SMA 00:31:45.947 #define SPDK_CONFIG_TESTS 1 00:31:45.947 #undef SPDK_CONFIG_TSAN 00:31:45.947 #define SPDK_CONFIG_UBLK 1 00:31:45.947 #define SPDK_CONFIG_UBSAN 1 00:31:45.947 #undef SPDK_CONFIG_UNIT_TESTS 00:31:45.947 #undef SPDK_CONFIG_URING 00:31:45.947 #define SPDK_CONFIG_URING_PATH 00:31:45.947 #undef SPDK_CONFIG_URING_ZNS 00:31:45.947 #undef SPDK_CONFIG_USDT 00:31:45.947 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:31:45.947 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:31:45.947 #undef SPDK_CONFIG_VFIO_USER 00:31:45.947 #define SPDK_CONFIG_VFIO_USER_DIR 00:31:45.947 #define SPDK_CONFIG_VHOST 1 00:31:45.947 #define SPDK_CONFIG_VIRTIO 1 00:31:45.947 #undef SPDK_CONFIG_VTUNE 00:31:45.947 #define SPDK_CONFIG_VTUNE_DIR 00:31:45.947 #define SPDK_CONFIG_WERROR 1 00:31:45.947 #define SPDK_CONFIG_WPDK_DIR 00:31:45.947 #undef SPDK_CONFIG_XNVME 00:31:45.947 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:31:45.947 12:12:59 reap_unregistered_poller 
-- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:31:45.947 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:45.947 12:12:59 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:45.947 12:12:59 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:45.947 12:12:59 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:45.947 12:12:59 reap_unregistered_poller -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:45.947 12:12:59 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:45.947 12:12:59 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:45.947 12:12:59 
reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:31:45.947 12:12:59 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:45.947 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:45.947 12:12:59 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:31:45.947 12:12:59 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:31:45.948 12:12:59 reap_unregistered_poller -- 
pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@76 -- # SUDO[0]= 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:31:45.948 12:12:59 reap_unregistered_poller -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:31:45.948 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@150 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:31:45.948 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@171 -- # : 0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@177 -- # 
export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:31:45.949 12:12:59 
reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export 
VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@279 -- 
# MAKE=make 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j72 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 1626859 ]] 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 1626859 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.guE8Fo 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:31:45.949 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@345 -- # [[ -n '' ]] 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.guE8Fo/tests/interrupt /tmp/spdk.guE8Fo 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=955969536 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
uses["$mount"]=4328460288 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=84582547456 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=94508548096 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9926000640 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:45.949 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47249563648 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4710400 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=18892337152 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=18901712896 00:31:45.950 12:12:59 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # uses["$mount"]=9375744 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47253651456 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47254274048 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=622592 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9450848256 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9450852352 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:31:45.950 * Looking for test storage... 
00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=84582547456 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=12140593152 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.950 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1626906 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:31:45.950 12:12:59 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1626906 /var/tmp/spdk.sock 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@829 -- # '[' -z 1626906 ']' 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:45.950 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:45.952 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:45.952 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:31:45.952 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:45.952 12:12:59 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:46.211 [2024-07-15 12:12:59.556697] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:31:46.211 [2024-07-15 12:12:59.556764] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1626906 ] 00:31:46.211 [2024-07-15 12:12:59.685594] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:46.211 [2024-07-15 12:12:59.791274] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:46.211 [2024-07-15 12:12:59.791312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:46.211 [2024-07-15 12:12:59.791312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:46.470 [2024-07-15 12:12:59.862196] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:31:47.040 12:13:00 reap_unregistered_poller -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:47.040 12:13:00 reap_unregistered_poller -- common/autotest_common.sh@862 -- # return 0 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:31:47.040 12:13:00 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:47.040 12:13:00 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:47.040 12:13:00 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:31:47.040 "name": "app_thread", 00:31:47.040 "id": 1, 00:31:47.040 "active_pollers": [], 00:31:47.040 "timed_pollers": [ 00:31:47.040 { 00:31:47.040 "name": "rpc_subsystem_poll_servers", 00:31:47.040 "id": 1, 00:31:47.040 "state": "waiting", 00:31:47.040 "run_count": 0, 00:31:47.040 "busy_count": 0, 00:31:47.040 "period_ticks": 9200000 00:31:47.040 } 00:31:47.040 ], 00:31:47.040 "paused_pollers": [] 00:31:47.040 }' 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:31:47.040 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:31:47.300 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:31:47.300 12:13:00 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:31:47.300 
12:13:00 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:31:47.300 12:13:00 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:31:47.300 12:13:00 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:31:47.300 5000+0 records in 00:31:47.300 5000+0 records out 00:31:47.300 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0265266 s, 386 MB/s 00:31:47.300 12:13:00 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:31:47.559 AIO0 00:31:47.559 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:31:48.126 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@559 -- # xtrace_disable 00:31:48.126 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:48.126 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:31:48.126 "name": "app_thread", 00:31:48.126 "id": 1, 00:31:48.126 "active_pollers": [], 00:31:48.126 "timed_pollers": [ 00:31:48.126 { 00:31:48.126 "name": "rpc_subsystem_poll_servers", 00:31:48.126 "id": 1, 00:31:48.126 "state": "waiting", 00:31:48.126 "run_count": 0, 00:31:48.126 "busy_count": 0, 
00:31:48.126 "period_ticks": 9200000 00:31:48.126 } 00:31:48.126 ], 00:31:48.126 "paused_pollers": [] 00:31:48.126 }' 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:31:48.126 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:31:48.385 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:31:48.385 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:31:48.385 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:31:48.385 12:13:01 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1626906 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@948 -- # '[' -z 1626906 ']' 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@952 -- # kill -0 1626906 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@953 -- # uname 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1626906 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@966 -- # 
echo 'killing process with pid 1626906' 00:31:48.385 killing process with pid 1626906 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@967 -- # kill 1626906 00:31:48.385 12:13:01 reap_unregistered_poller -- common/autotest_common.sh@972 -- # wait 1626906 00:31:48.644 12:13:02 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:31:48.644 12:13:02 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:31:48.644 00:31:48.644 real 0m2.797s 00:31:48.645 user 0m1.822s 00:31:48.645 sys 0m0.726s 00:31:48.645 12:13:02 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # xtrace_disable 00:31:48.645 12:13:02 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:31:48.645 ************************************ 00:31:48.645 END TEST reap_unregistered_poller 00:31:48.645 ************************************ 00:31:48.645 12:13:02 -- common/autotest_common.sh@1142 -- # return 0 00:31:48.645 12:13:02 -- spdk/autotest.sh@198 -- # uname -s 00:31:48.645 12:13:02 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:31:48.645 12:13:02 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:31:48.645 12:13:02 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:31:48.645 12:13:02 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@260 -- # timing_exit lib 00:31:48.645 12:13:02 -- common/autotest_common.sh@728 -- # xtrace_disable 00:31:48.645 12:13:02 -- common/autotest_common.sh@10 -- # set +x 00:31:48.645 12:13:02 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:31:48.645 
12:13:02 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:31:48.645 12:13:02 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:48.645 12:13:02 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:31:48.645 12:13:02 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:31:48.645 12:13:02 -- common/autotest_common.sh@10 -- # set +x 00:31:48.645 ************************************ 00:31:48.645 START TEST compress_compdev 00:31:48.645 ************************************ 00:31:48.645 12:13:02 compress_compdev -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:31:48.905 * Looking for test storage... 
00:31:48.905 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:31:48.905 12:13:02 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:31:48.905 12:13:02 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:31:48.905 12:13:02 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:31:48.905 12:13:02 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:48.905 12:13:02 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:48.905 12:13:02 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:48.905 12:13:02 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:31:48.905 12:13:02 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:31:48.905 12:13:02 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1627348 00:31:48.905 12:13:02 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1627348 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1627348 ']' 00:31:48.905 12:13:02 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:48.905 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:31:48.905 12:13:02 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:31:48.905 [2024-07-15 12:13:02.382308] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:31:48.905 [2024-07-15 12:13:02.382380] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1627348 ] 00:31:49.165 [2024-07-15 12:13:02.516958] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:49.165 [2024-07-15 12:13:02.657321] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:49.165 [2024-07-15 12:13:02.657327] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:50.103 [2024-07-15 12:13:03.640968] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:31:50.362 12:13:03 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:31:50.362 12:13:03 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:31:50.362 12:13:03 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:31:50.362 12:13:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:31:50.362 12:13:03 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:31:53.652 [2024-07-15 12:13:06.775088] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x198a1e0 PMD being used: compress_qat 00:31:53.652 12:13:06 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 
00:31:53.652 12:13:06 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:53.652 12:13:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:31:53.911 [ 00:31:53.911 { 00:31:53.911 "name": "Nvme0n1", 00:31:53.911 "aliases": [ 00:31:53.911 "ef69b114-3330-44d7-bd3f-f3f62cabe78f" 00:31:53.911 ], 00:31:53.911 "product_name": "NVMe disk", 00:31:53.911 "block_size": 512, 00:31:53.911 "num_blocks": 15628053168, 00:31:53.911 "uuid": "ef69b114-3330-44d7-bd3f-f3f62cabe78f", 00:31:53.911 "assigned_rate_limits": { 00:31:53.911 "rw_ios_per_sec": 0, 00:31:53.911 "rw_mbytes_per_sec": 0, 00:31:53.911 "r_mbytes_per_sec": 0, 00:31:53.911 "w_mbytes_per_sec": 0 00:31:53.911 }, 00:31:53.911 "claimed": false, 00:31:53.911 "zoned": false, 00:31:53.911 "supported_io_types": { 00:31:53.911 "read": true, 00:31:53.911 "write": true, 00:31:53.911 "unmap": true, 00:31:53.911 "flush": true, 00:31:53.911 "reset": true, 00:31:53.911 "nvme_admin": true, 00:31:53.911 "nvme_io": true, 00:31:53.911 "nvme_io_md": false, 00:31:53.911 "write_zeroes": true, 00:31:53.911 "zcopy": false, 00:31:53.911 "get_zone_info": false, 00:31:53.911 "zone_management": false, 00:31:53.911 "zone_append": false, 00:31:53.911 "compare": false, 00:31:53.911 "compare_and_write": false, 00:31:53.911 "abort": true, 00:31:53.911 "seek_hole": false, 00:31:53.911 "seek_data": false, 00:31:53.911 "copy": false, 00:31:53.911 "nvme_iov_md": false 00:31:53.911 }, 00:31:53.911 "driver_specific": { 00:31:53.911 "nvme": [ 00:31:53.911 { 00:31:53.911 "pci_address": "0000:5e:00.0", 00:31:53.911 "trid": { 00:31:53.911 "trtype": "PCIe", 00:31:53.911 "traddr": "0000:5e:00.0" 00:31:53.911 }, 00:31:53.911 "ctrlr_data": { 00:31:53.911 "cntlid": 0, 00:31:53.911 "vendor_id": "0x8086", 00:31:53.911 "model_number": "INTEL SSDPE2KX080T8", 00:31:53.911 
"serial_number": "BTLJ817201BU8P0HGN", 00:31:53.911 "firmware_revision": "VDV10184", 00:31:53.911 "oacs": { 00:31:53.911 "security": 0, 00:31:53.911 "format": 1, 00:31:53.911 "firmware": 1, 00:31:53.911 "ns_manage": 1 00:31:53.911 }, 00:31:53.911 "multi_ctrlr": false, 00:31:53.911 "ana_reporting": false 00:31:53.911 }, 00:31:53.911 "vs": { 00:31:53.911 "nvme_version": "1.2" 00:31:53.911 }, 00:31:53.911 "ns_data": { 00:31:53.911 "id": 1, 00:31:53.911 "can_share": false 00:31:53.911 } 00:31:53.911 } 00:31:53.911 ], 00:31:53.911 "mp_policy": "active_passive" 00:31:53.911 } 00:31:53.911 } 00:31:53.911 ] 00:31:53.911 12:13:07 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:53.911 12:13:07 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:31:54.170 [2024-07-15 12:13:07.561358] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17d86c0 PMD being used: compress_qat 00:31:58.359 ba234eb7-c482-4d95-87f3-64d5183cd511 00:31:58.359 12:13:11 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:31:58.359 73167def-823f-40b3-b695-2d5893a0035e 00:31:58.359 12:13:11 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:31:58.359 12:13:11 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:31:58.359 [ 00:31:58.359 { 00:31:58.359 "name": "73167def-823f-40b3-b695-2d5893a0035e", 00:31:58.359 "aliases": [ 00:31:58.359 "lvs0/lv0" 00:31:58.359 ], 00:31:58.359 "product_name": "Logical Volume", 00:31:58.359 "block_size": 512, 00:31:58.359 "num_blocks": 204800, 00:31:58.360 "uuid": "73167def-823f-40b3-b695-2d5893a0035e", 00:31:58.360 "assigned_rate_limits": { 00:31:58.360 "rw_ios_per_sec": 0, 00:31:58.360 "rw_mbytes_per_sec": 0, 00:31:58.360 "r_mbytes_per_sec": 0, 00:31:58.360 "w_mbytes_per_sec": 0 00:31:58.360 }, 00:31:58.360 "claimed": false, 00:31:58.360 "zoned": false, 00:31:58.360 "supported_io_types": { 00:31:58.360 "read": true, 00:31:58.360 "write": true, 00:31:58.360 "unmap": true, 00:31:58.360 "flush": false, 00:31:58.360 "reset": true, 00:31:58.360 "nvme_admin": false, 00:31:58.360 "nvme_io": false, 00:31:58.360 "nvme_io_md": false, 00:31:58.360 "write_zeroes": true, 00:31:58.360 "zcopy": false, 00:31:58.360 "get_zone_info": false, 00:31:58.360 "zone_management": false, 00:31:58.360 "zone_append": false, 00:31:58.360 "compare": false, 00:31:58.360 "compare_and_write": false, 00:31:58.360 "abort": false, 00:31:58.360 "seek_hole": true, 00:31:58.360 "seek_data": true, 00:31:58.360 "copy": false, 00:31:58.360 "nvme_iov_md": false 00:31:58.360 }, 00:31:58.360 "driver_specific": { 00:31:58.360 "lvol": { 00:31:58.360 "lvol_store_uuid": "ba234eb7-c482-4d95-87f3-64d5183cd511", 00:31:58.360 "base_bdev": "Nvme0n1", 00:31:58.360 "thin_provision": true, 00:31:58.360 "num_allocated_clusters": 0, 00:31:58.360 "snapshot": false, 00:31:58.360 "clone": false, 00:31:58.360 "esnap_clone": false 00:31:58.360 } 00:31:58.360 } 00:31:58.360 } 00:31:58.360 ] 00:31:58.360 12:13:11 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:58.360 12:13:11 
compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:31:58.360 12:13:11 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:31:58.619 [2024-07-15 12:13:12.176546] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:31:58.619 COMP_lvs0/lv0 00:31:58.619 12:13:12 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:31:58.619 12:13:12 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:31:58.878 12:13:12 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:31:59.136 [ 00:31:59.136 { 00:31:59.136 "name": "COMP_lvs0/lv0", 00:31:59.136 "aliases": [ 00:31:59.136 "9a7528a7-d7f6-524b-b432-fd5705e0fb88" 00:31:59.136 ], 00:31:59.136 "product_name": "compress", 00:31:59.136 "block_size": 512, 00:31:59.136 "num_blocks": 200704, 00:31:59.136 "uuid": "9a7528a7-d7f6-524b-b432-fd5705e0fb88", 00:31:59.136 "assigned_rate_limits": { 00:31:59.136 "rw_ios_per_sec": 0, 00:31:59.136 "rw_mbytes_per_sec": 0, 00:31:59.136 "r_mbytes_per_sec": 0, 00:31:59.136 "w_mbytes_per_sec": 0 00:31:59.136 }, 00:31:59.136 "claimed": false, 00:31:59.136 "zoned": false, 00:31:59.136 "supported_io_types": { 00:31:59.136 "read": true, 00:31:59.136 
"write": true, 00:31:59.136 "unmap": false, 00:31:59.136 "flush": false, 00:31:59.136 "reset": false, 00:31:59.136 "nvme_admin": false, 00:31:59.136 "nvme_io": false, 00:31:59.136 "nvme_io_md": false, 00:31:59.136 "write_zeroes": true, 00:31:59.136 "zcopy": false, 00:31:59.136 "get_zone_info": false, 00:31:59.136 "zone_management": false, 00:31:59.136 "zone_append": false, 00:31:59.136 "compare": false, 00:31:59.136 "compare_and_write": false, 00:31:59.136 "abort": false, 00:31:59.136 "seek_hole": false, 00:31:59.136 "seek_data": false, 00:31:59.136 "copy": false, 00:31:59.136 "nvme_iov_md": false 00:31:59.136 }, 00:31:59.136 "driver_specific": { 00:31:59.136 "compress": { 00:31:59.136 "name": "COMP_lvs0/lv0", 00:31:59.136 "base_bdev_name": "73167def-823f-40b3-b695-2d5893a0035e" 00:31:59.136 } 00:31:59.136 } 00:31:59.136 } 00:31:59.136 ] 00:31:59.136 12:13:12 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:31:59.136 12:13:12 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:59.396 [2024-07-15 12:13:12.783387] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f8abc1b15c0 PMD being used: compress_qat 00:31:59.396 [2024-07-15 12:13:12.786769] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b7cd40 PMD being used: compress_qat 00:31:59.396 Running I/O for 3 seconds... 
00:32:02.680 00:32:02.680 Latency(us) 00:32:02.680 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:02.680 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:02.680 Verification LBA range: start 0x0 length 0x3100 00:32:02.680 COMP_lvs0/lv0 : 3.01 1663.06 6.50 0.00 0.00 19134.61 272.47 23251.03 00:32:02.680 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:02.680 Verification LBA range: start 0x3100 length 0x3100 00:32:02.680 COMP_lvs0/lv0 : 3.01 1739.12 6.79 0.00 0.00 18284.11 315.21 22225.25 00:32:02.680 =================================================================================================================== 00:32:02.680 Total : 3402.18 13.29 0.00 0.00 18700.02 272.47 23251.03 00:32:02.680 0 00:32:02.680 12:13:15 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:32:02.680 12:13:15 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:02.680 12:13:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:02.940 12:13:16 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:02.940 12:13:16 compress_compdev -- compress/compress.sh@78 -- # killprocess 1627348 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1627348 ']' 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1627348 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1627348 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
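An aside on the results table above: the Total row is not printed anywhere as an independent measurement — it is an aggregate of the two per-core jobs, with the average latency weighted by each job's IOPS share. A minimal Python sketch of that aggregation, using the rounded figures as printed (so the weighted average only approximates bdevperf's internally computed 18700.02 us):

```python
# Aggregate the two per-core bdevperf jobs from the table above.
# Each tuple is (IOPS, MiB/s, average latency in us) as printed per job.
jobs = [
    (1663.06, 6.50, 19134.61),  # job on core mask 0x2
    (1739.12, 6.79, 18284.11),  # job on core mask 0x4
]

total_iops = sum(iops for iops, _, _ in jobs)
total_mib_s = sum(mib for _, mib, _ in jobs)

# The combined average latency is weighted by each job's IOPS,
# not a plain mean of the two per-job averages.
avg_latency_us = sum(iops * lat for iops, _, lat in jobs) / total_iops

print(f"Total : {total_iops:.2f} {total_mib_s:.2f} {avg_latency_us:.2f}")
```

Running this reproduces the Total IOPS (3402.18) and MiB/s (13.29) exactly; the weighted latency lands within a fraction of a microsecond of the printed 18700.02, the residue of per-job rounding.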
00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1627348' 00:32:02.940 killing process with pid 1627348 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@967 -- # kill 1627348 00:32:02.940 Received shutdown signal, test time was about 3.000000 seconds 00:32:02.940 00:32:02.940 Latency(us) 00:32:02.940 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:02.940 =================================================================================================================== 00:32:02.940 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:02.940 12:13:16 compress_compdev -- common/autotest_common.sh@972 -- # wait 1627348 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1630036 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1630036 00:32:11.059 12:13:23 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1630036 ']' 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:11.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:11.059 12:13:23 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:11.059 [2024-07-15 12:13:23.489477] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:32:11.059 [2024-07-15 12:13:23.489550] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1630036 ] 00:32:11.059 [2024-07-15 12:13:23.623758] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:11.059 [2024-07-15 12:13:23.742466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:11.059 [2024-07-15 12:13:23.742470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:11.319 [2024-07-15 12:13:24.709788] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:11.319 12:13:24 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:11.319 12:13:24 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:32:11.319 12:13:24 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:32:11.319 12:13:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:11.319 12:13:24 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:14.674 [2024-07-15 12:13:27.902955] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x131a1e0 PMD being used: compress_qat 00:32:14.674 12:13:27 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:14.674 12:13:27 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:14.933 12:13:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:15.191 [ 00:32:15.191 { 00:32:15.191 "name": "Nvme0n1", 00:32:15.191 "aliases": [ 00:32:15.191 "30742826-0652-4491-a9f1-baec68720791" 00:32:15.191 ], 00:32:15.191 "product_name": "NVMe disk", 00:32:15.191 "block_size": 512, 00:32:15.191 "num_blocks": 15628053168, 00:32:15.191 "uuid": "30742826-0652-4491-a9f1-baec68720791", 00:32:15.191 "assigned_rate_limits": { 00:32:15.191 "rw_ios_per_sec": 0, 00:32:15.191 "rw_mbytes_per_sec": 0, 00:32:15.191 "r_mbytes_per_sec": 0, 00:32:15.191 "w_mbytes_per_sec": 0 00:32:15.191 }, 00:32:15.191 "claimed": false, 00:32:15.191 "zoned": false, 00:32:15.191 "supported_io_types": { 00:32:15.191 "read": true, 00:32:15.191 "write": true, 00:32:15.191 "unmap": true, 00:32:15.191 "flush": true, 00:32:15.191 "reset": true, 00:32:15.191 "nvme_admin": true, 00:32:15.191 "nvme_io": true, 00:32:15.191 "nvme_io_md": false, 00:32:15.191 "write_zeroes": true, 00:32:15.191 "zcopy": false, 00:32:15.191 "get_zone_info": false, 00:32:15.191 "zone_management": false, 00:32:15.191 "zone_append": false, 00:32:15.191 "compare": false, 00:32:15.191 "compare_and_write": false, 00:32:15.191 "abort": true, 
00:32:15.191 "seek_hole": false, 00:32:15.191 "seek_data": false, 00:32:15.191 "copy": false, 00:32:15.191 "nvme_iov_md": false 00:32:15.191 }, 00:32:15.192 "driver_specific": { 00:32:15.192 "nvme": [ 00:32:15.192 { 00:32:15.192 "pci_address": "0000:5e:00.0", 00:32:15.192 "trid": { 00:32:15.192 "trtype": "PCIe", 00:32:15.192 "traddr": "0000:5e:00.0" 00:32:15.192 }, 00:32:15.192 "ctrlr_data": { 00:32:15.192 "cntlid": 0, 00:32:15.192 "vendor_id": "0x8086", 00:32:15.192 "model_number": "INTEL SSDPE2KX080T8", 00:32:15.192 "serial_number": "BTLJ817201BU8P0HGN", 00:32:15.192 "firmware_revision": "VDV10184", 00:32:15.192 "oacs": { 00:32:15.192 "security": 0, 00:32:15.192 "format": 1, 00:32:15.192 "firmware": 1, 00:32:15.192 "ns_manage": 1 00:32:15.192 }, 00:32:15.192 "multi_ctrlr": false, 00:32:15.192 "ana_reporting": false 00:32:15.192 }, 00:32:15.192 "vs": { 00:32:15.192 "nvme_version": "1.2" 00:32:15.192 }, 00:32:15.192 "ns_data": { 00:32:15.192 "id": 1, 00:32:15.192 "can_share": false 00:32:15.192 } 00:32:15.192 } 00:32:15.192 ], 00:32:15.192 "mp_policy": "active_passive" 00:32:15.192 } 00:32:15.192 } 00:32:15.192 ] 00:32:15.192 12:13:28 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:15.192 12:13:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:15.451 [2024-07-15 12:13:28.935918] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11686c0 PMD being used: compress_qat 00:32:19.647 63fcde8d-0ec4-4964-a540-deda1fd0cb80 00:32:19.647 12:13:32 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:19.647 5c68a1fa-f093-48c0-8d4c-5ae69f4e7f47 00:32:19.647 12:13:32 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@897 -- # local 
bdev_name=lvs0/lv0 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:19.647 12:13:32 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:19.647 12:13:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:19.906 [ 00:32:19.906 { 00:32:19.906 "name": "5c68a1fa-f093-48c0-8d4c-5ae69f4e7f47", 00:32:19.906 "aliases": [ 00:32:19.906 "lvs0/lv0" 00:32:19.906 ], 00:32:19.906 "product_name": "Logical Volume", 00:32:19.906 "block_size": 512, 00:32:19.906 "num_blocks": 204800, 00:32:19.906 "uuid": "5c68a1fa-f093-48c0-8d4c-5ae69f4e7f47", 00:32:19.906 "assigned_rate_limits": { 00:32:19.906 "rw_ios_per_sec": 0, 00:32:19.906 "rw_mbytes_per_sec": 0, 00:32:19.906 "r_mbytes_per_sec": 0, 00:32:19.906 "w_mbytes_per_sec": 0 00:32:19.906 }, 00:32:19.906 "claimed": false, 00:32:19.906 "zoned": false, 00:32:19.906 "supported_io_types": { 00:32:19.906 "read": true, 00:32:19.906 "write": true, 00:32:19.906 "unmap": true, 00:32:19.906 "flush": false, 00:32:19.906 "reset": true, 00:32:19.906 "nvme_admin": false, 00:32:19.906 "nvme_io": false, 00:32:19.906 "nvme_io_md": false, 00:32:19.906 "write_zeroes": true, 00:32:19.906 "zcopy": false, 00:32:19.906 "get_zone_info": false, 00:32:19.906 "zone_management": false, 00:32:19.906 "zone_append": false, 00:32:19.906 "compare": false, 00:32:19.906 "compare_and_write": false, 00:32:19.906 "abort": false, 00:32:19.906 "seek_hole": true, 00:32:19.906 "seek_data": true, 00:32:19.906 "copy": false, 00:32:19.906 "nvme_iov_md": false 
00:32:19.906 }, 00:32:19.906 "driver_specific": { 00:32:19.906 "lvol": { 00:32:19.906 "lvol_store_uuid": "63fcde8d-0ec4-4964-a540-deda1fd0cb80", 00:32:19.906 "base_bdev": "Nvme0n1", 00:32:19.906 "thin_provision": true, 00:32:19.906 "num_allocated_clusters": 0, 00:32:19.906 "snapshot": false, 00:32:19.906 "clone": false, 00:32:19.906 "esnap_clone": false 00:32:19.906 } 00:32:19.906 } 00:32:19.906 } 00:32:19.906 ] 00:32:19.906 12:13:33 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:19.906 12:13:33 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:32:19.906 12:13:33 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:32:20.164 [2024-07-15 12:13:33.598211] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:20.164 COMP_lvs0/lv0 00:32:20.164 12:13:33 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:20.164 12:13:33 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:20.422 12:13:33 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:20.681 [ 00:32:20.681 { 00:32:20.681 "name": "COMP_lvs0/lv0", 00:32:20.681 "aliases": [ 00:32:20.681 
"ed4f60c1-421e-5407-a4c2-ded24de80231" 00:32:20.681 ], 00:32:20.681 "product_name": "compress", 00:32:20.681 "block_size": 512, 00:32:20.681 "num_blocks": 200704, 00:32:20.681 "uuid": "ed4f60c1-421e-5407-a4c2-ded24de80231", 00:32:20.681 "assigned_rate_limits": { 00:32:20.681 "rw_ios_per_sec": 0, 00:32:20.681 "rw_mbytes_per_sec": 0, 00:32:20.681 "r_mbytes_per_sec": 0, 00:32:20.681 "w_mbytes_per_sec": 0 00:32:20.681 }, 00:32:20.681 "claimed": false, 00:32:20.681 "zoned": false, 00:32:20.681 "supported_io_types": { 00:32:20.681 "read": true, 00:32:20.681 "write": true, 00:32:20.681 "unmap": false, 00:32:20.681 "flush": false, 00:32:20.681 "reset": false, 00:32:20.681 "nvme_admin": false, 00:32:20.681 "nvme_io": false, 00:32:20.681 "nvme_io_md": false, 00:32:20.681 "write_zeroes": true, 00:32:20.681 "zcopy": false, 00:32:20.681 "get_zone_info": false, 00:32:20.681 "zone_management": false, 00:32:20.681 "zone_append": false, 00:32:20.681 "compare": false, 00:32:20.681 "compare_and_write": false, 00:32:20.681 "abort": false, 00:32:20.681 "seek_hole": false, 00:32:20.681 "seek_data": false, 00:32:20.681 "copy": false, 00:32:20.681 "nvme_iov_md": false 00:32:20.681 }, 00:32:20.681 "driver_specific": { 00:32:20.681 "compress": { 00:32:20.681 "name": "COMP_lvs0/lv0", 00:32:20.681 "base_bdev_name": "5c68a1fa-f093-48c0-8d4c-5ae69f4e7f47" 00:32:20.681 } 00:32:20.681 } 00:32:20.681 } 00:32:20.681 ] 00:32:20.681 12:13:34 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:20.681 12:13:34 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:20.681 [2024-07-15 12:13:34.189203] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f12a81b15c0 PMD being used: compress_qat 00:32:20.681 [2024-07-15 12:13:34.192585] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x150cd40 PMD being used: compress_qat 00:32:20.681 Running I/O for 3 seconds... 
00:32:23.977 00:32:23.977 Latency(us) 00:32:23.977 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:23.977 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:23.977 Verification LBA range: start 0x0 length 0x3100 00:32:23.977 COMP_lvs0/lv0 : 3.01 1667.90 6.52 0.00 0.00 19094.24 336.58 22681.15 00:32:23.977 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:23.977 Verification LBA range: start 0x3100 length 0x3100 00:32:23.977 COMP_lvs0/lv0 : 3.01 1743.61 6.81 0.00 0.00 18232.56 274.25 20287.67 00:32:23.977 =================================================================================================================== 00:32:23.977 Total : 3411.51 13.33 0.00 0.00 18653.84 274.25 22681.15 00:32:23.977 0 00:32:23.977 12:13:37 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:32:23.977 12:13:37 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:23.978 12:13:37 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:24.238 12:13:37 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:24.238 12:13:37 compress_compdev -- compress/compress.sh@78 -- # killprocess 1630036 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1630036 ']' 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1630036 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1630036 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1630036' 00:32:24.238 killing process with pid 1630036 00:32:24.238 12:13:37 compress_compdev -- common/autotest_common.sh@967 -- # kill 1630036 00:32:24.238 Received shutdown signal, test time was about 3.000000 seconds 00:32:24.238 00:32:24.238 Latency(us) 00:32:24.238 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:24.238 =================================================================================================================== 00:32:24.239 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:24.239 12:13:37 compress_compdev -- common/autotest_common.sh@972 -- # wait 1630036 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1632798 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:32:32.361 12:13:44 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1632798 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1632798 ']' 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@836 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:32.361 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:32.361 12:13:44 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:32.361 [2024-07-15 12:13:44.904002] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:32:32.361 [2024-07-15 12:13:44.904074] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1632798 ] 00:32:32.361 [2024-07-15 12:13:45.038237] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:32.361 [2024-07-15 12:13:45.154952] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:32.361 [2024-07-15 12:13:45.154957] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:32.619 [2024-07-15 12:13:46.094340] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:32.619 12:13:46 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:32.619 12:13:46 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:32:32.619 12:13:46 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:32:32.619 12:13:46 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:32.619 12:13:46 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:35.906 [2024-07-15 12:13:49.289049] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23ba1e0 PMD being used: compress_qat 00:32:35.906 12:13:49 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:35.906 12:13:49 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:36.164 12:13:49 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:36.423 [ 00:32:36.423 { 00:32:36.423 "name": "Nvme0n1", 00:32:36.423 "aliases": [ 00:32:36.423 "5bf0e854-6c91-4bc1-9ce0-91bc53a11d71" 00:32:36.423 ], 00:32:36.423 "product_name": "NVMe disk", 00:32:36.423 "block_size": 512, 00:32:36.423 "num_blocks": 15628053168, 00:32:36.423 "uuid": "5bf0e854-6c91-4bc1-9ce0-91bc53a11d71", 00:32:36.423 "assigned_rate_limits": { 00:32:36.423 "rw_ios_per_sec": 0, 00:32:36.423 "rw_mbytes_per_sec": 0, 00:32:36.423 "r_mbytes_per_sec": 0, 00:32:36.423 "w_mbytes_per_sec": 0 00:32:36.423 }, 00:32:36.423 "claimed": false, 00:32:36.423 "zoned": false, 00:32:36.423 "supported_io_types": { 00:32:36.423 "read": true, 00:32:36.423 "write": true, 00:32:36.423 "unmap": true, 00:32:36.423 "flush": true, 00:32:36.423 "reset": true, 00:32:36.423 "nvme_admin": true, 00:32:36.423 "nvme_io": true, 00:32:36.423 "nvme_io_md": false, 00:32:36.423 "write_zeroes": true, 00:32:36.423 "zcopy": false, 00:32:36.423 "get_zone_info": false, 00:32:36.423 "zone_management": false, 00:32:36.423 "zone_append": false, 00:32:36.423 "compare": false, 00:32:36.423 "compare_and_write": false, 00:32:36.423 "abort": true, 
00:32:36.423 "seek_hole": false, 00:32:36.423 "seek_data": false, 00:32:36.423 "copy": false, 00:32:36.423 "nvme_iov_md": false 00:32:36.423 }, 00:32:36.423 "driver_specific": { 00:32:36.423 "nvme": [ 00:32:36.423 { 00:32:36.423 "pci_address": "0000:5e:00.0", 00:32:36.423 "trid": { 00:32:36.423 "trtype": "PCIe", 00:32:36.423 "traddr": "0000:5e:00.0" 00:32:36.423 }, 00:32:36.423 "ctrlr_data": { 00:32:36.423 "cntlid": 0, 00:32:36.423 "vendor_id": "0x8086", 00:32:36.423 "model_number": "INTEL SSDPE2KX080T8", 00:32:36.423 "serial_number": "BTLJ817201BU8P0HGN", 00:32:36.423 "firmware_revision": "VDV10184", 00:32:36.423 "oacs": { 00:32:36.423 "security": 0, 00:32:36.423 "format": 1, 00:32:36.423 "firmware": 1, 00:32:36.423 "ns_manage": 1 00:32:36.423 }, 00:32:36.423 "multi_ctrlr": false, 00:32:36.423 "ana_reporting": false 00:32:36.423 }, 00:32:36.423 "vs": { 00:32:36.423 "nvme_version": "1.2" 00:32:36.423 }, 00:32:36.423 "ns_data": { 00:32:36.423 "id": 1, 00:32:36.423 "can_share": false 00:32:36.423 } 00:32:36.423 } 00:32:36.423 ], 00:32:36.423 "mp_policy": "active_passive" 00:32:36.423 } 00:32:36.423 } 00:32:36.423 ] 00:32:36.423 12:13:49 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:36.423 12:13:49 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:36.682 [2024-07-15 12:13:50.051246] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x22086c0 PMD being used: compress_qat 00:32:40.873 aafe32c6-04f5-4626-964d-57809f3bc534 00:32:40.873 12:13:53 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:32:40.873 2d0e57d8-d5b3-4aaa-a8eb-30792fdc80c5 00:32:40.873 12:13:53 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@897 -- # local 
bdev_name=lvs0/lv0 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:40.873 12:13:53 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:40.873 12:13:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:32:40.873 [ 00:32:40.873 { 00:32:40.873 "name": "2d0e57d8-d5b3-4aaa-a8eb-30792fdc80c5", 00:32:40.873 "aliases": [ 00:32:40.873 "lvs0/lv0" 00:32:40.873 ], 00:32:40.873 "product_name": "Logical Volume", 00:32:40.873 "block_size": 512, 00:32:40.873 "num_blocks": 204800, 00:32:40.873 "uuid": "2d0e57d8-d5b3-4aaa-a8eb-30792fdc80c5", 00:32:40.873 "assigned_rate_limits": { 00:32:40.873 "rw_ios_per_sec": 0, 00:32:40.873 "rw_mbytes_per_sec": 0, 00:32:40.873 "r_mbytes_per_sec": 0, 00:32:40.873 "w_mbytes_per_sec": 0 00:32:40.873 }, 00:32:40.873 "claimed": false, 00:32:40.873 "zoned": false, 00:32:40.873 "supported_io_types": { 00:32:40.873 "read": true, 00:32:40.873 "write": true, 00:32:40.873 "unmap": true, 00:32:40.873 "flush": false, 00:32:40.873 "reset": true, 00:32:40.873 "nvme_admin": false, 00:32:40.873 "nvme_io": false, 00:32:40.873 "nvme_io_md": false, 00:32:40.873 "write_zeroes": true, 00:32:40.873 "zcopy": false, 00:32:40.873 "get_zone_info": false, 00:32:40.873 "zone_management": false, 00:32:40.873 "zone_append": false, 00:32:40.873 "compare": false, 00:32:40.873 "compare_and_write": false, 00:32:40.873 "abort": false, 00:32:40.873 "seek_hole": true, 00:32:40.873 "seek_data": true, 00:32:40.873 "copy": false, 00:32:40.873 "nvme_iov_md": false 
00:32:40.873 }, 00:32:40.873 "driver_specific": { 00:32:40.873 "lvol": { 00:32:40.873 "lvol_store_uuid": "aafe32c6-04f5-4626-964d-57809f3bc534", 00:32:40.873 "base_bdev": "Nvme0n1", 00:32:40.873 "thin_provision": true, 00:32:40.873 "num_allocated_clusters": 0, 00:32:40.873 "snapshot": false, 00:32:40.873 "clone": false, 00:32:40.873 "esnap_clone": false 00:32:40.873 } 00:32:40.873 } 00:32:40.873 } 00:32:40.873 ] 00:32:40.873 12:13:54 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:40.873 12:13:54 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:32:40.873 12:13:54 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:32:41.131 [2024-07-15 12:13:54.666249] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:32:41.131 COMP_lvs0/lv0 00:32:41.131 12:13:54 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:41.131 12:13:54 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:41.390 12:13:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:32:41.649 [ 00:32:41.649 { 00:32:41.649 "name": "COMP_lvs0/lv0", 00:32:41.649 "aliases": [ 00:32:41.649 
"249e2f14-b2bf-5bcd-9c69-8df17a54008d" 00:32:41.649 ], 00:32:41.649 "product_name": "compress", 00:32:41.649 "block_size": 4096, 00:32:41.649 "num_blocks": 25088, 00:32:41.649 "uuid": "249e2f14-b2bf-5bcd-9c69-8df17a54008d", 00:32:41.649 "assigned_rate_limits": { 00:32:41.649 "rw_ios_per_sec": 0, 00:32:41.649 "rw_mbytes_per_sec": 0, 00:32:41.649 "r_mbytes_per_sec": 0, 00:32:41.649 "w_mbytes_per_sec": 0 00:32:41.649 }, 00:32:41.649 "claimed": false, 00:32:41.649 "zoned": false, 00:32:41.649 "supported_io_types": { 00:32:41.649 "read": true, 00:32:41.649 "write": true, 00:32:41.649 "unmap": false, 00:32:41.649 "flush": false, 00:32:41.649 "reset": false, 00:32:41.649 "nvme_admin": false, 00:32:41.649 "nvme_io": false, 00:32:41.649 "nvme_io_md": false, 00:32:41.649 "write_zeroes": true, 00:32:41.649 "zcopy": false, 00:32:41.649 "get_zone_info": false, 00:32:41.649 "zone_management": false, 00:32:41.649 "zone_append": false, 00:32:41.649 "compare": false, 00:32:41.649 "compare_and_write": false, 00:32:41.649 "abort": false, 00:32:41.649 "seek_hole": false, 00:32:41.649 "seek_data": false, 00:32:41.649 "copy": false, 00:32:41.649 "nvme_iov_md": false 00:32:41.649 }, 00:32:41.649 "driver_specific": { 00:32:41.649 "compress": { 00:32:41.649 "name": "COMP_lvs0/lv0", 00:32:41.649 "base_bdev_name": "2d0e57d8-d5b3-4aaa-a8eb-30792fdc80c5" 00:32:41.649 } 00:32:41.649 } 00:32:41.649 } 00:32:41.649 ] 00:32:41.649 12:13:55 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:41.649 12:13:55 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:41.908 [2024-07-15 12:13:55.273048] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f22901b15c0 PMD being used: compress_qat 00:32:41.908 [2024-07-15 12:13:55.276403] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x25acd40 PMD being used: compress_qat 00:32:41.908 Running I/O for 3 seconds... 
00:32:45.197 00:32:45.197 Latency(us) 00:32:45.197 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:45.197 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:32:45.197 Verification LBA range: start 0x0 length 0x3100 00:32:45.197 COMP_lvs0/lv0 : 3.01 1675.00 6.54 0.00 0.00 19002.26 343.71 22339.23 00:32:45.197 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:32:45.197 Verification LBA range: start 0x3100 length 0x3100 00:32:45.197 COMP_lvs0/lv0 : 3.01 1739.24 6.79 0.00 0.00 18272.51 297.41 21199.47 00:32:45.197 =================================================================================================================== 00:32:45.197 Total : 3414.24 13.34 0.00 0.00 18630.51 297.41 22339.23 00:32:45.197 0 00:32:45.197 12:13:58 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:32:45.197 12:13:58 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:32:45.197 12:13:58 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:32:45.197 12:13:58 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:32:45.197 12:13:58 compress_compdev -- compress/compress.sh@78 -- # killprocess 1632798 00:32:45.197 12:13:58 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1632798 ']' 00:32:45.197 12:13:58 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1632798 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1632798 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_1 
00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1632798' 00:32:45.456 killing process with pid 1632798 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@967 -- # kill 1632798 00:32:45.456 Received shutdown signal, test time was about 3.000000 seconds 00:32:45.456 00:32:45.456 Latency(us) 00:32:45.456 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:45.456 =================================================================================================================== 00:32:45.456 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:45.456 12:13:58 compress_compdev -- common/autotest_common.sh@972 -- # wait 1632798 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1635437 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:32:53.576 12:14:05 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1635437 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@829 -- # '[' -z 1635437 ']' 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@834 -- # local max_retries=100 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk.sock...' 00:32:53.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@838 -- # xtrace_disable 00:32:53.576 12:14:05 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:32:53.576 [2024-07-15 12:14:05.974126] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:32:53.576 [2024-07-15 12:14:05.974183] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1635437 ] 00:32:53.576 [2024-07-15 12:14:06.087954] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:53.576 [2024-07-15 12:14:06.198716] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:53.576 [2024-07-15 12:14:06.198802] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:53.576 [2024-07-15 12:14:06.198804] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:53.576 [2024-07-15 12:14:06.939922] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:32:53.576 12:14:06 compress_compdev -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:32:53.576 12:14:07 compress_compdev -- common/autotest_common.sh@862 -- # return 0 00:32:53.576 12:14:07 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:32:53.576 12:14:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:32:53.576 12:14:07 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:32:56.860 [2024-07-15 12:14:10.351136] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20fad00 PMD being used: compress_qat 00:32:56.860 
12:14:10 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:32:56.860 12:14:10 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:32:57.118 12:14:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:32:57.377 [ 00:32:57.377 { 00:32:57.377 "name": "Nvme0n1", 00:32:57.377 "aliases": [ 00:32:57.377 "a0f054b3-7cb5-49a9-ab0a-c7532b2dda88" 00:32:57.377 ], 00:32:57.377 "product_name": "NVMe disk", 00:32:57.377 "block_size": 512, 00:32:57.377 "num_blocks": 15628053168, 00:32:57.377 "uuid": "a0f054b3-7cb5-49a9-ab0a-c7532b2dda88", 00:32:57.377 "assigned_rate_limits": { 00:32:57.377 "rw_ios_per_sec": 0, 00:32:57.377 "rw_mbytes_per_sec": 0, 00:32:57.377 "r_mbytes_per_sec": 0, 00:32:57.377 "w_mbytes_per_sec": 0 00:32:57.377 }, 00:32:57.377 "claimed": false, 00:32:57.377 "zoned": false, 00:32:57.377 "supported_io_types": { 00:32:57.377 "read": true, 00:32:57.377 "write": true, 00:32:57.377 "unmap": true, 00:32:57.377 "flush": true, 00:32:57.377 "reset": true, 00:32:57.377 "nvme_admin": true, 00:32:57.377 "nvme_io": true, 00:32:57.377 "nvme_io_md": false, 00:32:57.377 "write_zeroes": true, 00:32:57.377 "zcopy": false, 00:32:57.377 "get_zone_info": false, 00:32:57.377 "zone_management": false, 00:32:57.377 "zone_append": false, 00:32:57.377 "compare": false, 00:32:57.377 "compare_and_write": false, 
00:32:57.377 "abort": true, 00:32:57.377 "seek_hole": false, 00:32:57.377 "seek_data": false, 00:32:57.377 "copy": false, 00:32:57.377 "nvme_iov_md": false 00:32:57.377 }, 00:32:57.377 "driver_specific": { 00:32:57.377 "nvme": [ 00:32:57.377 { 00:32:57.377 "pci_address": "0000:5e:00.0", 00:32:57.377 "trid": { 00:32:57.377 "trtype": "PCIe", 00:32:57.377 "traddr": "0000:5e:00.0" 00:32:57.377 }, 00:32:57.377 "ctrlr_data": { 00:32:57.377 "cntlid": 0, 00:32:57.377 "vendor_id": "0x8086", 00:32:57.377 "model_number": "INTEL SSDPE2KX080T8", 00:32:57.377 "serial_number": "BTLJ817201BU8P0HGN", 00:32:57.377 "firmware_revision": "VDV10184", 00:32:57.377 "oacs": { 00:32:57.377 "security": 0, 00:32:57.377 "format": 1, 00:32:57.377 "firmware": 1, 00:32:57.377 "ns_manage": 1 00:32:57.377 }, 00:32:57.377 "multi_ctrlr": false, 00:32:57.377 "ana_reporting": false 00:32:57.377 }, 00:32:57.377 "vs": { 00:32:57.377 "nvme_version": "1.2" 00:32:57.377 }, 00:32:57.377 "ns_data": { 00:32:57.377 "id": 1, 00:32:57.377 "can_share": false 00:32:57.377 } 00:32:57.377 } 00:32:57.377 ], 00:32:57.377 "mp_policy": "active_passive" 00:32:57.377 } 00:32:57.377 } 00:32:57.377 ] 00:32:57.377 12:14:10 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:32:57.377 12:14:10 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:32:57.377 [2024-07-15 12:14:10.912286] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f491d0 PMD being used: compress_qat 00:33:00.720 4db054b1-821f-4fa0-a003-8d6e5025aa11 00:33:00.720 12:14:14 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:00.978 27dbf805-d5a8-4dd9-9ee9-fde8cab2fb05 00:33:00.978 12:14:14 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:00.978 12:14:14 compress_compdev -- 
common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:33:00.978 12:14:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:00.978 12:14:14 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:33:00.978 12:14:14 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:00.978 12:14:14 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:00.978 12:14:14 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:01.237 12:14:14 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:33:01.521 [ 00:33:01.521 { 00:33:01.522 "name": "27dbf805-d5a8-4dd9-9ee9-fde8cab2fb05", 00:33:01.522 "aliases": [ 00:33:01.522 "lvs0/lv0" 00:33:01.522 ], 00:33:01.522 "product_name": "Logical Volume", 00:33:01.522 "block_size": 512, 00:33:01.522 "num_blocks": 204800, 00:33:01.522 "uuid": "27dbf805-d5a8-4dd9-9ee9-fde8cab2fb05", 00:33:01.522 "assigned_rate_limits": { 00:33:01.522 "rw_ios_per_sec": 0, 00:33:01.522 "rw_mbytes_per_sec": 0, 00:33:01.522 "r_mbytes_per_sec": 0, 00:33:01.522 "w_mbytes_per_sec": 0 00:33:01.522 }, 00:33:01.522 "claimed": false, 00:33:01.522 "zoned": false, 00:33:01.522 "supported_io_types": { 00:33:01.522 "read": true, 00:33:01.522 "write": true, 00:33:01.522 "unmap": true, 00:33:01.522 "flush": false, 00:33:01.522 "reset": true, 00:33:01.522 "nvme_admin": false, 00:33:01.522 "nvme_io": false, 00:33:01.522 "nvme_io_md": false, 00:33:01.522 "write_zeroes": true, 00:33:01.522 "zcopy": false, 00:33:01.522 "get_zone_info": false, 00:33:01.522 "zone_management": false, 00:33:01.522 "zone_append": false, 00:33:01.522 "compare": false, 00:33:01.522 "compare_and_write": false, 00:33:01.522 "abort": false, 00:33:01.522 "seek_hole": true, 00:33:01.522 "seek_data": true, 00:33:01.522 "copy": false, 
00:33:01.522 "nvme_iov_md": false 00:33:01.522 }, 00:33:01.522 "driver_specific": { 00:33:01.522 "lvol": { 00:33:01.522 "lvol_store_uuid": "4db054b1-821f-4fa0-a003-8d6e5025aa11", 00:33:01.522 "base_bdev": "Nvme0n1", 00:33:01.522 "thin_provision": true, 00:33:01.522 "num_allocated_clusters": 0, 00:33:01.522 "snapshot": false, 00:33:01.522 "clone": false, 00:33:01.522 "esnap_clone": false 00:33:01.522 } 00:33:01.522 } 00:33:01.522 } 00:33:01.522 ] 00:33:01.522 12:14:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:33:01.522 12:14:15 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:33:01.522 12:14:15 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:33:01.781 [2024-07-15 12:14:15.239726] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:01.781 COMP_lvs0/lv0 00:33:01.781 12:14:15 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@899 -- # local i 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:01.781 12:14:15 compress_compdev -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:02.040 12:14:15 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:02.299 [ 00:33:02.299 { 00:33:02.299 "name": "COMP_lvs0/lv0", 00:33:02.299 "aliases": [ 00:33:02.299 
"4c147029-41e6-539e-87f7-441c19d1593e" 00:33:02.299 ], 00:33:02.299 "product_name": "compress", 00:33:02.299 "block_size": 512, 00:33:02.299 "num_blocks": 200704, 00:33:02.299 "uuid": "4c147029-41e6-539e-87f7-441c19d1593e", 00:33:02.299 "assigned_rate_limits": { 00:33:02.299 "rw_ios_per_sec": 0, 00:33:02.299 "rw_mbytes_per_sec": 0, 00:33:02.299 "r_mbytes_per_sec": 0, 00:33:02.299 "w_mbytes_per_sec": 0 00:33:02.299 }, 00:33:02.299 "claimed": false, 00:33:02.299 "zoned": false, 00:33:02.299 "supported_io_types": { 00:33:02.299 "read": true, 00:33:02.299 "write": true, 00:33:02.299 "unmap": false, 00:33:02.299 "flush": false, 00:33:02.299 "reset": false, 00:33:02.299 "nvme_admin": false, 00:33:02.299 "nvme_io": false, 00:33:02.299 "nvme_io_md": false, 00:33:02.299 "write_zeroes": true, 00:33:02.299 "zcopy": false, 00:33:02.299 "get_zone_info": false, 00:33:02.299 "zone_management": false, 00:33:02.299 "zone_append": false, 00:33:02.299 "compare": false, 00:33:02.299 "compare_and_write": false, 00:33:02.299 "abort": false, 00:33:02.299 "seek_hole": false, 00:33:02.299 "seek_data": false, 00:33:02.299 "copy": false, 00:33:02.299 "nvme_iov_md": false 00:33:02.299 }, 00:33:02.299 "driver_specific": { 00:33:02.300 "compress": { 00:33:02.300 "name": "COMP_lvs0/lv0", 00:33:02.300 "base_bdev_name": "27dbf805-d5a8-4dd9-9ee9-fde8cab2fb05" 00:33:02.300 } 00:33:02.300 } 00:33:02.300 } 00:33:02.300 ] 00:33:02.300 12:14:15 compress_compdev -- common/autotest_common.sh@905 -- # return 0 00:33:02.300 12:14:15 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:33:02.300 [2024-07-15 12:14:15.824658] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f221c1b1350 PMD being used: compress_qat 00:33:02.300 I/O targets: 00:33:02.300 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:33:02.300 00:33:02.300 00:33:02.300 CUnit - A unit testing framework for C - Version 2.1-3 00:33:02.300 
http://cunit.sourceforge.net/ 00:33:02.300 00:33:02.300 00:33:02.300 Suite: bdevio tests on: COMP_lvs0/lv0 00:33:02.300 Test: blockdev write read block ...passed 00:33:02.300 Test: blockdev write zeroes read block ...passed 00:33:02.300 Test: blockdev write zeroes read no split ...passed 00:33:02.559 Test: blockdev write zeroes read split ...passed 00:33:02.559 Test: blockdev write zeroes read split partial ...passed 00:33:02.559 Test: blockdev reset ...[2024-07-15 12:14:15.955326] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:33:02.559 passed 00:33:02.559 Test: blockdev write read 8 blocks ...passed 00:33:02.559 Test: blockdev write read size > 128k ...passed 00:33:02.559 Test: blockdev write read invalid size ...passed 00:33:02.559 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:33:02.559 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:33:02.559 Test: blockdev write read max offset ...passed 00:33:02.559 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:33:02.559 Test: blockdev writev readv 8 blocks ...passed 00:33:02.559 Test: blockdev writev readv 30 x 1block ...passed 00:33:02.559 Test: blockdev writev readv block ...passed 00:33:02.559 Test: blockdev writev readv size > 128k ...passed 00:33:02.559 Test: blockdev writev readv size > 128k in two iovs ...passed 00:33:02.559 Test: blockdev comparev and writev ...passed 00:33:02.559 Test: blockdev nvme passthru rw ...passed 00:33:02.559 Test: blockdev nvme passthru vendor specific ...passed 00:33:02.559 Test: blockdev nvme admin passthru ...passed 00:33:02.559 Test: blockdev copy ...passed 00:33:02.559 00:33:02.559 Run Summary: Type Total Ran Passed Failed Inactive 00:33:02.559 suites 1 1 n/a 0 0 00:33:02.559 tests 23 23 23 0 0 00:33:02.559 asserts 130 130 130 0 n/a 00:33:02.559 00:33:02.559 Elapsed time = 0.365 seconds 00:33:02.559 0 00:33:02.559 12:14:16 compress_compdev -- 
compress/compress.sh@60 -- # destroy_vols 00:33:02.559 12:14:16 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:02.817 12:14:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:03.075 12:14:16 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:33:03.075 12:14:16 compress_compdev -- compress/compress.sh@62 -- # killprocess 1635437 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@948 -- # '[' -z 1635437 ']' 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@952 -- # kill -0 1635437 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@953 -- # uname 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1635437 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1635437' 00:33:03.075 killing process with pid 1635437 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@967 -- # kill 1635437 00:33:03.075 12:14:16 compress_compdev -- common/autotest_common.sh@972 -- # wait 1635437 00:33:11.194 12:14:23 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:33:11.194 12:14:23 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:33:11.194 00:33:11.194 real 1m21.473s 00:33:11.194 user 3m2.388s 00:33:11.194 sys 0m6.510s 00:33:11.194 12:14:23 compress_compdev -- common/autotest_common.sh@1124 -- # xtrace_disable 00:33:11.194 12:14:23 compress_compdev -- 
common/autotest_common.sh@10 -- # set +x 00:33:11.194 ************************************ 00:33:11.194 END TEST compress_compdev 00:33:11.194 ************************************ 00:33:11.194 12:14:23 -- common/autotest_common.sh@1142 -- # return 0 00:33:11.194 12:14:23 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:33:11.194 12:14:23 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:33:11.194 12:14:23 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:33:11.194 12:14:23 -- common/autotest_common.sh@10 -- # set +x 00:33:11.194 ************************************ 00:33:11.194 START TEST compress_isal 00:33:11.194 ************************************ 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:33:11.194 * Looking for test storage... 00:33:11.194 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@16 
-- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:11.194 12:14:23 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:33:11.194 12:14:23 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:11.194 12:14:23 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:11.194 12:14:23 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:11.194 12:14:23 compress_isal -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:11.194 12:14:23 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:11.194 12:14:23 compress_isal -- paths/export.sh@5 -- # export PATH 00:33:11.194 12:14:23 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@47 -- # : 0 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i 
"$NVMF_APP_SHM_ID" -e 0xFFFF) 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:33:11.194 12:14:23 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1637733 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:33:11.194 12:14:23 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1637733 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1637733 ']' 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:11.194 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:11.194 12:14:23 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:11.194 [2024-07-15 12:14:23.944631] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:33:11.194 [2024-07-15 12:14:23.944709] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1637733 ] 00:33:11.194 [2024-07-15 12:14:24.080602] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:11.194 [2024-07-15 12:14:24.196947] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:11.194 [2024-07-15 12:14:24.196956] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:11.453 12:14:24 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:11.453 12:14:24 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:33:11.453 12:14:24 compress_isal -- compress/compress.sh@74 -- # create_vols 00:33:11.453 12:14:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:11.453 12:14:24 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:33:14.743 12:14:28 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:14.743 
12:14:28 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:14.743 12:14:28 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:15.001 [ 00:33:15.002 { 00:33:15.002 "name": "Nvme0n1", 00:33:15.002 "aliases": [ 00:33:15.002 "5a81047c-eb88-4ae0-9957-d4588d061e1c" 00:33:15.002 ], 00:33:15.002 "product_name": "NVMe disk", 00:33:15.002 "block_size": 512, 00:33:15.002 "num_blocks": 15628053168, 00:33:15.002 "uuid": "5a81047c-eb88-4ae0-9957-d4588d061e1c", 00:33:15.002 "assigned_rate_limits": { 00:33:15.002 "rw_ios_per_sec": 0, 00:33:15.002 "rw_mbytes_per_sec": 0, 00:33:15.002 "r_mbytes_per_sec": 0, 00:33:15.002 "w_mbytes_per_sec": 0 00:33:15.002 }, 00:33:15.002 "claimed": false, 00:33:15.002 "zoned": false, 00:33:15.002 "supported_io_types": { 00:33:15.002 "read": true, 00:33:15.002 "write": true, 00:33:15.002 "unmap": true, 00:33:15.002 "flush": true, 00:33:15.002 "reset": true, 00:33:15.002 "nvme_admin": true, 00:33:15.002 "nvme_io": true, 00:33:15.002 "nvme_io_md": false, 00:33:15.002 "write_zeroes": true, 00:33:15.002 "zcopy": false, 00:33:15.002 "get_zone_info": false, 00:33:15.002 "zone_management": false, 00:33:15.002 "zone_append": false, 00:33:15.002 "compare": false, 00:33:15.002 "compare_and_write": false, 00:33:15.002 "abort": true, 00:33:15.002 "seek_hole": false, 00:33:15.002 "seek_data": false, 00:33:15.002 "copy": false, 00:33:15.002 "nvme_iov_md": false 00:33:15.002 }, 00:33:15.002 "driver_specific": { 00:33:15.002 "nvme": [ 00:33:15.002 { 00:33:15.002 "pci_address": "0000:5e:00.0", 00:33:15.002 "trid": { 00:33:15.002 "trtype": "PCIe", 00:33:15.002 "traddr": "0000:5e:00.0" 00:33:15.002 }, 00:33:15.002 "ctrlr_data": { 00:33:15.002 "cntlid": 0, 00:33:15.002 "vendor_id": "0x8086", 00:33:15.002 "model_number": "INTEL SSDPE2KX080T8", 00:33:15.002 "serial_number": 
"BTLJ817201BU8P0HGN", 00:33:15.002 "firmware_revision": "VDV10184", 00:33:15.002 "oacs": { 00:33:15.002 "security": 0, 00:33:15.002 "format": 1, 00:33:15.002 "firmware": 1, 00:33:15.002 "ns_manage": 1 00:33:15.002 }, 00:33:15.002 "multi_ctrlr": false, 00:33:15.002 "ana_reporting": false 00:33:15.002 }, 00:33:15.002 "vs": { 00:33:15.002 "nvme_version": "1.2" 00:33:15.002 }, 00:33:15.002 "ns_data": { 00:33:15.002 "id": 1, 00:33:15.002 "can_share": false 00:33:15.002 } 00:33:15.002 } 00:33:15.002 ], 00:33:15.002 "mp_policy": "active_passive" 00:33:15.002 } 00:33:15.002 } 00:33:15.002 ] 00:33:15.002 12:14:28 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:15.002 12:14:28 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:33:19.223 627064cb-b860-4161-a55e-2fee4836e969 00:33:19.224 12:14:32 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:19.224 5bbd942e-72ff-49ef-8764-837a9aadcdb0 00:33:19.224 12:14:32 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:19.224 12:14:32 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:19.482 12:14:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 
2000 00:33:19.741 [ 00:33:19.741 { 00:33:19.741 "name": "5bbd942e-72ff-49ef-8764-837a9aadcdb0", 00:33:19.741 "aliases": [ 00:33:19.741 "lvs0/lv0" 00:33:19.741 ], 00:33:19.741 "product_name": "Logical Volume", 00:33:19.741 "block_size": 512, 00:33:19.741 "num_blocks": 204800, 00:33:19.741 "uuid": "5bbd942e-72ff-49ef-8764-837a9aadcdb0", 00:33:19.741 "assigned_rate_limits": { 00:33:19.741 "rw_ios_per_sec": 0, 00:33:19.741 "rw_mbytes_per_sec": 0, 00:33:19.741 "r_mbytes_per_sec": 0, 00:33:19.741 "w_mbytes_per_sec": 0 00:33:19.741 }, 00:33:19.741 "claimed": false, 00:33:19.741 "zoned": false, 00:33:19.741 "supported_io_types": { 00:33:19.741 "read": true, 00:33:19.741 "write": true, 00:33:19.741 "unmap": true, 00:33:19.741 "flush": false, 00:33:19.741 "reset": true, 00:33:19.741 "nvme_admin": false, 00:33:19.741 "nvme_io": false, 00:33:19.741 "nvme_io_md": false, 00:33:19.741 "write_zeroes": true, 00:33:19.741 "zcopy": false, 00:33:19.741 "get_zone_info": false, 00:33:19.741 "zone_management": false, 00:33:19.741 "zone_append": false, 00:33:19.741 "compare": false, 00:33:19.741 "compare_and_write": false, 00:33:19.741 "abort": false, 00:33:19.741 "seek_hole": true, 00:33:19.741 "seek_data": true, 00:33:19.741 "copy": false, 00:33:19.741 "nvme_iov_md": false 00:33:19.741 }, 00:33:19.741 "driver_specific": { 00:33:19.741 "lvol": { 00:33:19.741 "lvol_store_uuid": "627064cb-b860-4161-a55e-2fee4836e969", 00:33:19.741 "base_bdev": "Nvme0n1", 00:33:19.741 "thin_provision": true, 00:33:19.741 "num_allocated_clusters": 0, 00:33:19.741 "snapshot": false, 00:33:19.741 "clone": false, 00:33:19.741 "esnap_clone": false 00:33:19.741 } 00:33:19.741 } 00:33:19.741 } 00:33:19.741 ] 00:33:19.741 12:14:33 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:19.741 12:14:33 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:33:19.741 12:14:33 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:33:20.001 [2024-07-15 12:14:33.427968] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:20.001 COMP_lvs0/lv0 00:33:20.001 12:14:33 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:20.001 12:14:33 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:20.260 12:14:33 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:20.518 [ 00:33:20.518 { 00:33:20.518 "name": "COMP_lvs0/lv0", 00:33:20.518 "aliases": [ 00:33:20.518 "c7d2e368-7972-5196-b3d3-fa752e7eccde" 00:33:20.518 ], 00:33:20.518 "product_name": "compress", 00:33:20.518 "block_size": 512, 00:33:20.518 "num_blocks": 200704, 00:33:20.518 "uuid": "c7d2e368-7972-5196-b3d3-fa752e7eccde", 00:33:20.518 "assigned_rate_limits": { 00:33:20.518 "rw_ios_per_sec": 0, 00:33:20.518 "rw_mbytes_per_sec": 0, 00:33:20.518 "r_mbytes_per_sec": 0, 00:33:20.518 "w_mbytes_per_sec": 0 00:33:20.518 }, 00:33:20.518 "claimed": false, 00:33:20.518 "zoned": false, 00:33:20.518 "supported_io_types": { 00:33:20.518 "read": true, 00:33:20.518 "write": true, 00:33:20.518 "unmap": false, 00:33:20.518 "flush": false, 00:33:20.518 "reset": false, 00:33:20.518 "nvme_admin": false, 00:33:20.518 "nvme_io": false, 00:33:20.518 "nvme_io_md": false, 00:33:20.518 
"write_zeroes": true, 00:33:20.518 "zcopy": false, 00:33:20.518 "get_zone_info": false, 00:33:20.518 "zone_management": false, 00:33:20.518 "zone_append": false, 00:33:20.518 "compare": false, 00:33:20.518 "compare_and_write": false, 00:33:20.518 "abort": false, 00:33:20.518 "seek_hole": false, 00:33:20.518 "seek_data": false, 00:33:20.518 "copy": false, 00:33:20.518 "nvme_iov_md": false 00:33:20.518 }, 00:33:20.518 "driver_specific": { 00:33:20.518 "compress": { 00:33:20.518 "name": "COMP_lvs0/lv0", 00:33:20.518 "base_bdev_name": "5bbd942e-72ff-49ef-8764-837a9aadcdb0" 00:33:20.518 } 00:33:20.518 } 00:33:20.518 } 00:33:20.518 ] 00:33:20.518 12:14:33 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:20.518 12:14:33 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:20.518 Running I/O for 3 seconds... 00:33:23.804 00:33:23.804 Latency(us) 00:33:23.804 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:23.804 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:33:23.804 Verification LBA range: start 0x0 length 0x3100 00:33:23.804 COMP_lvs0/lv0 : 3.01 1260.75 4.92 0.00 0.00 25247.05 179.87 27582.11 00:33:23.804 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:33:23.804 Verification LBA range: start 0x3100 length 0x3100 00:33:23.804 COMP_lvs0/lv0 : 3.01 1260.84 4.93 0.00 0.00 25231.88 191.44 27582.11 00:33:23.804 =================================================================================================================== 00:33:23.804 Total : 2521.58 9.85 0.00 0.00 25239.47 179.87 27582.11 00:33:23.804 0 00:33:23.804 12:14:37 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:33:23.804 12:14:37 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:23.804 
12:14:37 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:24.063 12:14:37 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:33:24.063 12:14:37 compress_isal -- compress/compress.sh@78 -- # killprocess 1637733 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1637733 ']' 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1637733 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@953 -- # uname 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1637733 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1637733' 00:33:24.063 killing process with pid 1637733 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@967 -- # kill 1637733 00:33:24.063 Received shutdown signal, test time was about 3.000000 seconds 00:33:24.063 00:33:24.063 Latency(us) 00:33:24.063 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:24.063 =================================================================================================================== 00:33:24.063 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:24.063 12:14:37 compress_isal -- common/autotest_common.sh@972 -- # wait 1637733 00:33:32.176 12:14:44 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:33:32.176 12:14:44 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:33:32.176 12:14:44 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1640269 
00:33:32.176 12:14:44 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:32.177 12:14:44 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:33:32.177 12:14:44 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1640269 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1640269 ']' 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:32.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:32.177 12:14:44 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:32.177 [2024-07-15 12:14:44.696265] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:33:32.177 [2024-07-15 12:14:44.696337] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1640269 ] 00:33:32.177 [2024-07-15 12:14:44.830867] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:32.177 [2024-07-15 12:14:44.970145] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:32.177 [2024-07-15 12:14:44.970152] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:32.177 12:14:45 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:32.177 12:14:45 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:33:32.177 12:14:45 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:33:32.177 12:14:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:33:32.177 12:14:45 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:35.465 12:14:48 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:35.465 12:14:48 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:35.465 12:14:49 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:35.724 [ 00:33:35.724 { 00:33:35.724 "name": "Nvme0n1", 00:33:35.724 "aliases": [ 00:33:35.724 "1eb87c32-880b-4e20-9e9c-cbf5aff3f446" 00:33:35.724 ], 00:33:35.724 "product_name": "NVMe disk", 00:33:35.724 "block_size": 512, 00:33:35.724 "num_blocks": 15628053168, 00:33:35.724 "uuid": "1eb87c32-880b-4e20-9e9c-cbf5aff3f446", 00:33:35.724 "assigned_rate_limits": { 00:33:35.724 "rw_ios_per_sec": 0, 00:33:35.724 "rw_mbytes_per_sec": 0, 00:33:35.724 "r_mbytes_per_sec": 0, 00:33:35.724 "w_mbytes_per_sec": 0 00:33:35.724 }, 00:33:35.724 "claimed": false, 00:33:35.724 "zoned": false, 00:33:35.724 "supported_io_types": { 00:33:35.724 "read": true, 00:33:35.724 "write": true, 00:33:35.724 "unmap": true, 00:33:35.724 "flush": true, 00:33:35.724 "reset": true, 00:33:35.724 "nvme_admin": true, 00:33:35.724 "nvme_io": true, 00:33:35.724 "nvme_io_md": false, 00:33:35.724 "write_zeroes": true, 00:33:35.724 "zcopy": false, 00:33:35.724 "get_zone_info": false, 00:33:35.724 "zone_management": false, 00:33:35.724 "zone_append": false, 00:33:35.724 "compare": false, 00:33:35.724 "compare_and_write": false, 00:33:35.724 "abort": true, 00:33:35.724 "seek_hole": false, 00:33:35.724 "seek_data": false, 00:33:35.724 "copy": false, 00:33:35.724 "nvme_iov_md": false 00:33:35.724 }, 00:33:35.724 "driver_specific": { 00:33:35.724 "nvme": [ 00:33:35.724 { 00:33:35.724 "pci_address": "0000:5e:00.0", 00:33:35.724 "trid": { 00:33:35.724 "trtype": "PCIe", 00:33:35.724 "traddr": "0000:5e:00.0" 00:33:35.724 }, 00:33:35.724 "ctrlr_data": { 00:33:35.724 "cntlid": 0, 00:33:35.724 "vendor_id": "0x8086", 00:33:35.724 "model_number": "INTEL SSDPE2KX080T8", 00:33:35.724 "serial_number": "BTLJ817201BU8P0HGN", 00:33:35.724 "firmware_revision": "VDV10184", 00:33:35.724 "oacs": { 00:33:35.724 "security": 0, 00:33:35.724 "format": 1, 00:33:35.724 "firmware": 1, 00:33:35.724 "ns_manage": 1 00:33:35.724 }, 00:33:35.724 "multi_ctrlr": false, 00:33:35.724 "ana_reporting": false 
00:33:35.724 }, 00:33:35.724 "vs": { 00:33:35.724 "nvme_version": "1.2" 00:33:35.724 }, 00:33:35.724 "ns_data": { 00:33:35.724 "id": 1, 00:33:35.724 "can_share": false 00:33:35.724 } 00:33:35.724 } 00:33:35.724 ], 00:33:35.724 "mp_policy": "active_passive" 00:33:35.724 } 00:33:35.724 } 00:33:35.724 ] 00:33:35.724 12:14:49 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:35.724 12:14:49 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:33:39.910 375b565a-04e7-4fe7-aeb7-b20be8451adf 00:33:39.910 12:14:53 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:33:39.910 49d05141-add5-4e61-b2d9-b0a10cbbcfb8 00:33:39.910 12:14:53 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:39.910 12:14:53 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:40.168 12:14:53 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:33:40.425 [ 00:33:40.425 { 00:33:40.425 "name": "49d05141-add5-4e61-b2d9-b0a10cbbcfb8", 00:33:40.425 "aliases": [ 00:33:40.425 "lvs0/lv0" 00:33:40.425 ], 00:33:40.425 "product_name": "Logical Volume", 00:33:40.425 "block_size": 512, 00:33:40.425 "num_blocks": 204800, 00:33:40.425 
"uuid": "49d05141-add5-4e61-b2d9-b0a10cbbcfb8", 00:33:40.425 "assigned_rate_limits": { 00:33:40.425 "rw_ios_per_sec": 0, 00:33:40.425 "rw_mbytes_per_sec": 0, 00:33:40.425 "r_mbytes_per_sec": 0, 00:33:40.425 "w_mbytes_per_sec": 0 00:33:40.425 }, 00:33:40.425 "claimed": false, 00:33:40.425 "zoned": false, 00:33:40.425 "supported_io_types": { 00:33:40.425 "read": true, 00:33:40.425 "write": true, 00:33:40.425 "unmap": true, 00:33:40.425 "flush": false, 00:33:40.425 "reset": true, 00:33:40.425 "nvme_admin": false, 00:33:40.425 "nvme_io": false, 00:33:40.425 "nvme_io_md": false, 00:33:40.425 "write_zeroes": true, 00:33:40.425 "zcopy": false, 00:33:40.425 "get_zone_info": false, 00:33:40.425 "zone_management": false, 00:33:40.425 "zone_append": false, 00:33:40.425 "compare": false, 00:33:40.425 "compare_and_write": false, 00:33:40.425 "abort": false, 00:33:40.425 "seek_hole": true, 00:33:40.425 "seek_data": true, 00:33:40.425 "copy": false, 00:33:40.425 "nvme_iov_md": false 00:33:40.425 }, 00:33:40.425 "driver_specific": { 00:33:40.425 "lvol": { 00:33:40.425 "lvol_store_uuid": "375b565a-04e7-4fe7-aeb7-b20be8451adf", 00:33:40.425 "base_bdev": "Nvme0n1", 00:33:40.425 "thin_provision": true, 00:33:40.425 "num_allocated_clusters": 0, 00:33:40.425 "snapshot": false, 00:33:40.425 "clone": false, 00:33:40.425 "esnap_clone": false 00:33:40.425 } 00:33:40.425 } 00:33:40.425 } 00:33:40.425 ] 00:33:40.425 12:14:53 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:40.425 12:14:53 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:33:40.425 12:14:53 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:33:40.684 [2024-07-15 12:14:54.230005] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:33:40.684 COMP_lvs0/lv0 00:33:40.684 12:14:54 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:40.684 12:14:54 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:40.941 12:14:54 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:33:41.200 [ 00:33:41.200 { 00:33:41.200 "name": "COMP_lvs0/lv0", 00:33:41.200 "aliases": [ 00:33:41.200 "1d8a1b80-1a27-509b-91a0-9c105fdf51bd" 00:33:41.200 ], 00:33:41.200 "product_name": "compress", 00:33:41.200 "block_size": 512, 00:33:41.200 "num_blocks": 200704, 00:33:41.200 "uuid": "1d8a1b80-1a27-509b-91a0-9c105fdf51bd", 00:33:41.200 "assigned_rate_limits": { 00:33:41.200 "rw_ios_per_sec": 0, 00:33:41.200 "rw_mbytes_per_sec": 0, 00:33:41.200 "r_mbytes_per_sec": 0, 00:33:41.200 "w_mbytes_per_sec": 0 00:33:41.200 }, 00:33:41.200 "claimed": false, 00:33:41.200 "zoned": false, 00:33:41.200 "supported_io_types": { 00:33:41.200 "read": true, 00:33:41.200 "write": true, 00:33:41.200 "unmap": false, 00:33:41.200 "flush": false, 00:33:41.200 "reset": false, 00:33:41.200 "nvme_admin": false, 00:33:41.200 "nvme_io": false, 00:33:41.200 "nvme_io_md": false, 00:33:41.200 "write_zeroes": true, 00:33:41.200 "zcopy": false, 00:33:41.200 "get_zone_info": false, 00:33:41.200 "zone_management": false, 00:33:41.200 "zone_append": false, 00:33:41.200 "compare": false, 00:33:41.200 "compare_and_write": false, 00:33:41.200 "abort": false, 00:33:41.200 "seek_hole": false, 
00:33:41.200 "seek_data": false, 00:33:41.200 "copy": false, 00:33:41.200 "nvme_iov_md": false 00:33:41.200 }, 00:33:41.200 "driver_specific": { 00:33:41.200 "compress": { 00:33:41.200 "name": "COMP_lvs0/lv0", 00:33:41.200 "base_bdev_name": "49d05141-add5-4e61-b2d9-b0a10cbbcfb8" 00:33:41.200 } 00:33:41.200 } 00:33:41.200 } 00:33:41.200 ] 00:33:41.200 12:14:54 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:41.200 12:14:54 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:41.469 Running I/O for 3 seconds... 00:33:44.807 00:33:44.807 Latency(us) 00:33:44.807 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.807 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:33:44.807 Verification LBA range: start 0x0 length 0x3100 00:33:44.807 COMP_lvs0/lv0 : 3.02 1256.80 4.91 0.00 0.00 25320.36 169.18 28835.84 00:33:44.807 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:33:44.807 Verification LBA range: start 0x3100 length 0x3100 00:33:44.807 COMP_lvs0/lv0 : 3.02 1263.73 4.94 0.00 0.00 25156.26 292.06 28721.86 00:33:44.807 =================================================================================================================== 00:33:44.807 Total : 2520.53 9.85 0.00 0.00 25238.08 169.18 28835.84 00:33:44.807 0 00:33:44.807 12:14:57 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:33:44.807 12:14:57 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:33:44.807 12:14:58 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:33:45.066 12:14:58 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:33:45.066 12:14:58 compress_isal -- 
compress/compress.sh@78 -- # killprocess 1640269 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1640269 ']' 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1640269 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@953 -- # uname 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1640269 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1640269' 00:33:45.066 killing process with pid 1640269 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@967 -- # kill 1640269 00:33:45.066 Received shutdown signal, test time was about 3.000000 seconds 00:33:45.066 00:33:45.066 Latency(us) 00:33:45.066 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:45.066 =================================================================================================================== 00:33:45.066 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:45.066 12:14:58 compress_isal -- common/autotest_common.sh@972 -- # wait 1640269 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1643304 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 
32 -o 4096 -w verify -t 3 -C -m 0x6 00:33:53.177 12:15:05 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1643304 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1643304 ']' 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:53.177 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:33:53.177 12:15:05 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:33:53.177 [2024-07-15 12:15:05.643611] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:33:53.177 [2024-07-15 12:15:05.643709] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1643304 ] 00:33:53.177 [2024-07-15 12:15:05.780061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:53.177 [2024-07-15 12:15:05.902994] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:33:53.177 [2024-07-15 12:15:05.903000] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:53.177 12:15:06 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:33:53.177 12:15:06 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:33:53.177 12:15:06 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:33:53.177 12:15:06 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
load_subsystem_config 00:33:53.177 12:15:06 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:33:56.464 12:15:09 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@899 -- # local i 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:33:56.464 12:15:09 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:33:56.464 12:15:10 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:33:56.723 [ 00:33:56.723 { 00:33:56.723 "name": "Nvme0n1", 00:33:56.723 "aliases": [ 00:33:56.723 "cf7a76d1-2e15-4d0a-b9c6-b2dac09eda84" 00:33:56.723 ], 00:33:56.723 "product_name": "NVMe disk", 00:33:56.723 "block_size": 512, 00:33:56.723 "num_blocks": 15628053168, 00:33:56.723 "uuid": "cf7a76d1-2e15-4d0a-b9c6-b2dac09eda84", 00:33:56.723 "assigned_rate_limits": { 00:33:56.723 "rw_ios_per_sec": 0, 00:33:56.723 "rw_mbytes_per_sec": 0, 00:33:56.723 "r_mbytes_per_sec": 0, 00:33:56.723 "w_mbytes_per_sec": 0 00:33:56.723 }, 00:33:56.723 "claimed": false, 00:33:56.723 "zoned": false, 00:33:56.723 "supported_io_types": { 00:33:56.723 "read": true, 00:33:56.723 "write": true, 00:33:56.723 "unmap": true, 00:33:56.723 "flush": true, 00:33:56.723 "reset": true, 00:33:56.723 "nvme_admin": true, 00:33:56.723 "nvme_io": true, 00:33:56.723 "nvme_io_md": false, 00:33:56.723 "write_zeroes": true, 00:33:56.723 "zcopy": false, 00:33:56.723 "get_zone_info": false, 
00:33:56.723 "zone_management": false, 00:33:56.723 "zone_append": false, 00:33:56.723 "compare": false, 00:33:56.723 "compare_and_write": false, 00:33:56.723 "abort": true, 00:33:56.723 "seek_hole": false, 00:33:56.723 "seek_data": false, 00:33:56.723 "copy": false, 00:33:56.723 "nvme_iov_md": false 00:33:56.723 }, 00:33:56.723 "driver_specific": { 00:33:56.723 "nvme": [ 00:33:56.723 { 00:33:56.723 "pci_address": "0000:5e:00.0", 00:33:56.723 "trid": { 00:33:56.723 "trtype": "PCIe", 00:33:56.723 "traddr": "0000:5e:00.0" 00:33:56.723 }, 00:33:56.723 "ctrlr_data": { 00:33:56.723 "cntlid": 0, 00:33:56.723 "vendor_id": "0x8086", 00:33:56.723 "model_number": "INTEL SSDPE2KX080T8", 00:33:56.723 "serial_number": "BTLJ817201BU8P0HGN", 00:33:56.723 "firmware_revision": "VDV10184", 00:33:56.723 "oacs": { 00:33:56.723 "security": 0, 00:33:56.723 "format": 1, 00:33:56.723 "firmware": 1, 00:33:56.723 "ns_manage": 1 00:33:56.723 }, 00:33:56.723 "multi_ctrlr": false, 00:33:56.723 "ana_reporting": false 00:33:56.723 }, 00:33:56.723 "vs": { 00:33:56.723 "nvme_version": "1.2" 00:33:56.723 }, 00:33:56.723 "ns_data": { 00:33:56.723 "id": 1, 00:33:56.723 "can_share": false 00:33:56.723 } 00:33:56.723 } 00:33:56.723 ], 00:33:56.723 "mp_policy": "active_passive" 00:33:56.723 } 00:33:56.723 } 00:33:56.723 ] 00:33:56.723 12:15:10 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:33:56.723 12:15:10 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:00.912 2441850a-cd05-4b60-94b8-9687931ce0ab 00:34:00.912 12:15:14 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:00.912 6f32919b-1541-4e72-b8cb-f59f79f70998 00:34:00.912 12:15:14 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:00.912 12:15:14 compress_isal -- 
common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:34:00.912 12:15:14 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:00.912 12:15:14 compress_isal -- common/autotest_common.sh@899 -- # local i 00:34:00.912 12:15:14 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:00.912 12:15:14 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:00.912 12:15:14 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:01.170 12:15:14 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:01.429 [ 00:34:01.429 { 00:34:01.429 "name": "6f32919b-1541-4e72-b8cb-f59f79f70998", 00:34:01.429 "aliases": [ 00:34:01.429 "lvs0/lv0" 00:34:01.429 ], 00:34:01.429 "product_name": "Logical Volume", 00:34:01.429 "block_size": 512, 00:34:01.429 "num_blocks": 204800, 00:34:01.429 "uuid": "6f32919b-1541-4e72-b8cb-f59f79f70998", 00:34:01.429 "assigned_rate_limits": { 00:34:01.429 "rw_ios_per_sec": 0, 00:34:01.429 "rw_mbytes_per_sec": 0, 00:34:01.429 "r_mbytes_per_sec": 0, 00:34:01.429 "w_mbytes_per_sec": 0 00:34:01.429 }, 00:34:01.429 "claimed": false, 00:34:01.429 "zoned": false, 00:34:01.429 "supported_io_types": { 00:34:01.429 "read": true, 00:34:01.429 "write": true, 00:34:01.429 "unmap": true, 00:34:01.429 "flush": false, 00:34:01.429 "reset": true, 00:34:01.429 "nvme_admin": false, 00:34:01.429 "nvme_io": false, 00:34:01.429 "nvme_io_md": false, 00:34:01.429 "write_zeroes": true, 00:34:01.429 "zcopy": false, 00:34:01.429 "get_zone_info": false, 00:34:01.429 "zone_management": false, 00:34:01.429 "zone_append": false, 00:34:01.429 "compare": false, 00:34:01.429 "compare_and_write": false, 00:34:01.429 "abort": false, 00:34:01.429 "seek_hole": true, 00:34:01.429 "seek_data": true, 00:34:01.430 "copy": false, 00:34:01.430 
"nvme_iov_md": false 00:34:01.430 }, 00:34:01.430 "driver_specific": { 00:34:01.430 "lvol": { 00:34:01.430 "lvol_store_uuid": "2441850a-cd05-4b60-94b8-9687931ce0ab", 00:34:01.430 "base_bdev": "Nvme0n1", 00:34:01.430 "thin_provision": true, 00:34:01.430 "num_allocated_clusters": 0, 00:34:01.430 "snapshot": false, 00:34:01.430 "clone": false, 00:34:01.430 "esnap_clone": false 00:34:01.430 } 00:34:01.430 } 00:34:01.430 } 00:34:01.430 ] 00:34:01.430 12:15:14 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:34:01.430 12:15:14 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:34:01.430 12:15:14 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:34:01.687 [2024-07-15 12:15:15.091350] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:01.687 COMP_lvs0/lv0 00:34:01.688 12:15:15 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@899 -- # local i 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:01.688 12:15:15 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:01.946 12:15:15 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:02.205 [ 00:34:02.205 { 00:34:02.205 "name": "COMP_lvs0/lv0", 00:34:02.205 "aliases": [ 00:34:02.205 
"69fea2d3-4f82-5bf8-b32c-c39715acde84" 00:34:02.205 ], 00:34:02.205 "product_name": "compress", 00:34:02.205 "block_size": 4096, 00:34:02.205 "num_blocks": 25088, 00:34:02.205 "uuid": "69fea2d3-4f82-5bf8-b32c-c39715acde84", 00:34:02.205 "assigned_rate_limits": { 00:34:02.205 "rw_ios_per_sec": 0, 00:34:02.205 "rw_mbytes_per_sec": 0, 00:34:02.205 "r_mbytes_per_sec": 0, 00:34:02.205 "w_mbytes_per_sec": 0 00:34:02.205 }, 00:34:02.205 "claimed": false, 00:34:02.205 "zoned": false, 00:34:02.205 "supported_io_types": { 00:34:02.205 "read": true, 00:34:02.205 "write": true, 00:34:02.205 "unmap": false, 00:34:02.205 "flush": false, 00:34:02.205 "reset": false, 00:34:02.205 "nvme_admin": false, 00:34:02.205 "nvme_io": false, 00:34:02.205 "nvme_io_md": false, 00:34:02.205 "write_zeroes": true, 00:34:02.205 "zcopy": false, 00:34:02.205 "get_zone_info": false, 00:34:02.205 "zone_management": false, 00:34:02.205 "zone_append": false, 00:34:02.205 "compare": false, 00:34:02.205 "compare_and_write": false, 00:34:02.205 "abort": false, 00:34:02.205 "seek_hole": false, 00:34:02.205 "seek_data": false, 00:34:02.205 "copy": false, 00:34:02.205 "nvme_iov_md": false 00:34:02.205 }, 00:34:02.205 "driver_specific": { 00:34:02.205 "compress": { 00:34:02.205 "name": "COMP_lvs0/lv0", 00:34:02.205 "base_bdev_name": "6f32919b-1541-4e72-b8cb-f59f79f70998" 00:34:02.205 } 00:34:02.205 } 00:34:02.205 } 00:34:02.205 ] 00:34:02.205 12:15:15 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:34:02.205 12:15:15 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:34:02.205 Running I/O for 3 seconds... 
00:34:05.496 00:34:05.496 Latency(us) 00:34:05.496 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:05.496 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:34:05.496 Verification LBA range: start 0x0 length 0x3100 00:34:05.496 COMP_lvs0/lv0 : 3.01 1262.09 4.93 0.00 0.00 25227.76 562.75 27810.06 00:34:05.496 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:34:05.496 Verification LBA range: start 0x3100 length 0x3100 00:34:05.496 COMP_lvs0/lv0 : 3.02 1278.19 4.99 0.00 0.00 24867.01 190.55 26784.28 00:34:05.496 =================================================================================================================== 00:34:05.496 Total : 2540.28 9.92 0.00 0.00 25046.16 190.55 27810.06 00:34:05.496 0 00:34:05.496 12:15:18 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:34:05.496 12:15:18 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:34:05.496 12:15:19 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:34:05.755 12:15:19 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:34:05.755 12:15:19 compress_isal -- compress/compress.sh@78 -- # killprocess 1643304 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1643304 ']' 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1643304 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@953 -- # uname 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1643304 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:34:05.755 12:15:19 
compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1643304' 00:34:05.755 killing process with pid 1643304 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@967 -- # kill 1643304 00:34:05.755 Received shutdown signal, test time was about 3.000000 seconds 00:34:05.755 00:34:05.755 Latency(us) 00:34:05.755 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:05.755 =================================================================================================================== 00:34:05.755 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:05.755 12:15:19 compress_isal -- common/autotest_common.sh@972 -- # wait 1643304 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1646029 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:34:13.874 12:15:26 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1646029 00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@829 -- # '[' -z 1646029 ']' 00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:13.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:13.874 12:15:26 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:34:13.874 [2024-07-15 12:15:26.405828] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:34:13.874 [2024-07-15 12:15:26.405902] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1646029 ] 00:34:13.874 [2024-07-15 12:15:26.534756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:13.874 [2024-07-15 12:15:26.638835] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:13.874 [2024-07-15 12:15:26.638933] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:13.874 [2024-07-15 12:15:26.638936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:13.874 12:15:27 compress_isal -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:13.874 12:15:27 compress_isal -- common/autotest_common.sh@862 -- # return 0 00:34:13.874 12:15:27 compress_isal -- compress/compress.sh@58 -- # create_vols 00:34:13.874 12:15:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:34:13.874 12:15:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:34:17.170 12:15:30 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=Nvme0n1 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@899 -- # local i 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 
00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:17.170 12:15:30 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:34:17.429 [ 00:34:17.429 { 00:34:17.429 "name": "Nvme0n1", 00:34:17.429 "aliases": [ 00:34:17.429 "c1b3c70f-1384-4280-a0f5-18512d153882" 00:34:17.429 ], 00:34:17.429 "product_name": "NVMe disk", 00:34:17.429 "block_size": 512, 00:34:17.429 "num_blocks": 15628053168, 00:34:17.429 "uuid": "c1b3c70f-1384-4280-a0f5-18512d153882", 00:34:17.429 "assigned_rate_limits": { 00:34:17.429 "rw_ios_per_sec": 0, 00:34:17.429 "rw_mbytes_per_sec": 0, 00:34:17.429 "r_mbytes_per_sec": 0, 00:34:17.429 "w_mbytes_per_sec": 0 00:34:17.429 }, 00:34:17.429 "claimed": false, 00:34:17.429 "zoned": false, 00:34:17.429 "supported_io_types": { 00:34:17.429 "read": true, 00:34:17.429 "write": true, 00:34:17.429 "unmap": true, 00:34:17.429 "flush": true, 00:34:17.429 "reset": true, 00:34:17.429 "nvme_admin": true, 00:34:17.429 "nvme_io": true, 00:34:17.429 "nvme_io_md": false, 00:34:17.429 "write_zeroes": true, 00:34:17.429 "zcopy": false, 00:34:17.429 "get_zone_info": false, 00:34:17.429 "zone_management": false, 00:34:17.429 "zone_append": false, 00:34:17.429 "compare": false, 00:34:17.429 "compare_and_write": false, 00:34:17.429 "abort": true, 00:34:17.429 "seek_hole": false, 00:34:17.429 "seek_data": false, 00:34:17.429 "copy": false, 00:34:17.429 "nvme_iov_md": false 00:34:17.429 }, 00:34:17.429 "driver_specific": { 00:34:17.429 "nvme": [ 00:34:17.429 { 00:34:17.429 "pci_address": "0000:5e:00.0", 00:34:17.429 "trid": { 00:34:17.429 "trtype": "PCIe", 00:34:17.429 "traddr": "0000:5e:00.0" 00:34:17.429 }, 00:34:17.429 "ctrlr_data": { 00:34:17.429 "cntlid": 0, 00:34:17.429 
"vendor_id": "0x8086", 00:34:17.429 "model_number": "INTEL SSDPE2KX080T8", 00:34:17.429 "serial_number": "BTLJ817201BU8P0HGN", 00:34:17.429 "firmware_revision": "VDV10184", 00:34:17.429 "oacs": { 00:34:17.429 "security": 0, 00:34:17.429 "format": 1, 00:34:17.429 "firmware": 1, 00:34:17.429 "ns_manage": 1 00:34:17.429 }, 00:34:17.429 "multi_ctrlr": false, 00:34:17.429 "ana_reporting": false 00:34:17.429 }, 00:34:17.429 "vs": { 00:34:17.429 "nvme_version": "1.2" 00:34:17.429 }, 00:34:17.429 "ns_data": { 00:34:17.429 "id": 1, 00:34:17.429 "can_share": false 00:34:17.429 } 00:34:17.429 } 00:34:17.429 ], 00:34:17.429 "mp_policy": "active_passive" 00:34:17.429 } 00:34:17.429 } 00:34:17.429 ] 00:34:17.429 12:15:30 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:34:17.429 12:15:30 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:34:21.617 d6609aed-f496-4d35-a3be-4f38ddb17d28 00:34:21.617 12:15:34 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:34:21.617 09931591-6d43-434d-9557-cac5d8b77473 00:34:21.617 12:15:34 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=lvs0/lv0 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@899 -- # local i 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:21.617 12:15:34 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:21.617 12:15:34 compress_isal -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:34:21.617 [ 00:34:21.617 { 00:34:21.617 "name": "09931591-6d43-434d-9557-cac5d8b77473", 00:34:21.617 "aliases": [ 00:34:21.617 "lvs0/lv0" 00:34:21.617 ], 00:34:21.617 "product_name": "Logical Volume", 00:34:21.617 "block_size": 512, 00:34:21.617 "num_blocks": 204800, 00:34:21.617 "uuid": "09931591-6d43-434d-9557-cac5d8b77473", 00:34:21.617 "assigned_rate_limits": { 00:34:21.617 "rw_ios_per_sec": 0, 00:34:21.617 "rw_mbytes_per_sec": 0, 00:34:21.617 "r_mbytes_per_sec": 0, 00:34:21.617 "w_mbytes_per_sec": 0 00:34:21.617 }, 00:34:21.617 "claimed": false, 00:34:21.617 "zoned": false, 00:34:21.617 "supported_io_types": { 00:34:21.617 "read": true, 00:34:21.617 "write": true, 00:34:21.617 "unmap": true, 00:34:21.617 "flush": false, 00:34:21.617 "reset": true, 00:34:21.617 "nvme_admin": false, 00:34:21.617 "nvme_io": false, 00:34:21.617 "nvme_io_md": false, 00:34:21.617 "write_zeroes": true, 00:34:21.617 "zcopy": false, 00:34:21.617 "get_zone_info": false, 00:34:21.617 "zone_management": false, 00:34:21.617 "zone_append": false, 00:34:21.617 "compare": false, 00:34:21.617 "compare_and_write": false, 00:34:21.617 "abort": false, 00:34:21.617 "seek_hole": true, 00:34:21.617 "seek_data": true, 00:34:21.617 "copy": false, 00:34:21.617 "nvme_iov_md": false 00:34:21.617 }, 00:34:21.617 "driver_specific": { 00:34:21.617 "lvol": { 00:34:21.617 "lvol_store_uuid": "d6609aed-f496-4d35-a3be-4f38ddb17d28", 00:34:21.617 "base_bdev": "Nvme0n1", 00:34:21.617 "thin_provision": true, 00:34:21.617 "num_allocated_clusters": 0, 00:34:21.617 "snapshot": false, 00:34:21.617 "clone": false, 00:34:21.617 "esnap_clone": false 00:34:21.617 } 00:34:21.617 } 00:34:21.617 } 00:34:21.617 ] 00:34:21.617 12:15:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:34:21.617 12:15:35 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:34:21.617 
12:15:35 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:34:21.875 [2024-07-15 12:15:35.419175] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:34:21.875 COMP_lvs0/lv0 00:34:21.875 12:15:35 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@897 -- # local bdev_name=COMP_lvs0/lv0 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@899 -- # local i 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@900 -- # bdev_timeout=2000 00:34:21.875 12:15:35 compress_isal -- common/autotest_common.sh@902 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:34:22.133 12:15:35 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:34:22.392 [ 00:34:22.392 { 00:34:22.392 "name": "COMP_lvs0/lv0", 00:34:22.392 "aliases": [ 00:34:22.392 "5e15d0dc-6865-5cc6-acc2-bcfc87bbcaae" 00:34:22.392 ], 00:34:22.392 "product_name": "compress", 00:34:22.392 "block_size": 512, 00:34:22.392 "num_blocks": 200704, 00:34:22.392 "uuid": "5e15d0dc-6865-5cc6-acc2-bcfc87bbcaae", 00:34:22.392 "assigned_rate_limits": { 00:34:22.392 "rw_ios_per_sec": 0, 00:34:22.392 "rw_mbytes_per_sec": 0, 00:34:22.392 "r_mbytes_per_sec": 0, 00:34:22.392 "w_mbytes_per_sec": 0 00:34:22.392 }, 00:34:22.392 "claimed": false, 00:34:22.392 "zoned": false, 00:34:22.392 "supported_io_types": { 00:34:22.392 "read": true, 00:34:22.392 "write": true, 00:34:22.392 "unmap": false, 00:34:22.392 "flush": false, 00:34:22.392 "reset": false, 
00:34:22.392 "nvme_admin": false, 00:34:22.392 "nvme_io": false, 00:34:22.392 "nvme_io_md": false, 00:34:22.392 "write_zeroes": true, 00:34:22.392 "zcopy": false, 00:34:22.392 "get_zone_info": false, 00:34:22.392 "zone_management": false, 00:34:22.392 "zone_append": false, 00:34:22.392 "compare": false, 00:34:22.392 "compare_and_write": false, 00:34:22.392 "abort": false, 00:34:22.392 "seek_hole": false, 00:34:22.392 "seek_data": false, 00:34:22.392 "copy": false, 00:34:22.392 "nvme_iov_md": false 00:34:22.392 }, 00:34:22.392 "driver_specific": { 00:34:22.392 "compress": { 00:34:22.392 "name": "COMP_lvs0/lv0", 00:34:22.392 "base_bdev_name": "09931591-6d43-434d-9557-cac5d8b77473" 00:34:22.392 } 00:34:22.392 } 00:34:22.392 } 00:34:22.392 ] 00:34:22.392 12:15:35 compress_isal -- common/autotest_common.sh@905 -- # return 0 00:34:22.392 12:15:35 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:22.651 I/O targets: 00:34:22.651 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:34:22.651 00:34:22.651 00:34:22.651 CUnit - A unit testing framework for C - Version 2.1-3 00:34:22.651 http://cunit.sourceforge.net/ 00:34:22.651 00:34:22.651 00:34:22.651 Suite: bdevio tests on: COMP_lvs0/lv0 00:34:22.651 Test: blockdev write read block ...passed 00:34:22.651 Test: blockdev write zeroes read block ...passed 00:34:22.651 Test: blockdev write zeroes read no split ...passed 00:34:22.651 Test: blockdev write zeroes read split ...passed 00:34:22.651 Test: blockdev write zeroes read split partial ...passed 00:34:22.651 Test: blockdev reset ...[2024-07-15 12:15:36.166856] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:34:22.651 passed 00:34:22.651 Test: blockdev write read 8 blocks ...passed 00:34:22.651 Test: blockdev write read size > 128k ...passed 00:34:22.651 Test: blockdev write read invalid size ...passed 00:34:22.651 Test: blockdev write read offset 
+ nbytes == size of blockdev ...passed 00:34:22.651 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:22.651 Test: blockdev write read max offset ...passed 00:34:22.651 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:22.651 Test: blockdev writev readv 8 blocks ...passed 00:34:22.651 Test: blockdev writev readv 30 x 1block ...passed 00:34:22.651 Test: blockdev writev readv block ...passed 00:34:22.651 Test: blockdev writev readv size > 128k ...passed 00:34:22.651 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:22.651 Test: blockdev comparev and writev ...passed 00:34:22.651 Test: blockdev nvme passthru rw ...passed 00:34:22.651 Test: blockdev nvme passthru vendor specific ...passed 00:34:22.651 Test: blockdev nvme admin passthru ...passed 00:34:22.651 Test: blockdev copy ...passed 00:34:22.651 00:34:22.651 Run Summary: Type Total Ran Passed Failed Inactive 00:34:22.651 suites 1 1 n/a 0 0 00:34:22.651 tests 23 23 23 0 0 00:34:22.651 asserts 130 130 130 0 n/a 00:34:22.651 00:34:22.651 Elapsed time = 0.400 seconds 00:34:22.651 0 00:34:22.651 12:15:36 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:34:22.651 12:15:36 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:34:22.910 12:15:36 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:34:23.168 12:15:36 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:34:23.168 12:15:36 compress_isal -- compress/compress.sh@62 -- # killprocess 1646029 00:34:23.168 12:15:36 compress_isal -- common/autotest_common.sh@948 -- # '[' -z 1646029 ']' 00:34:23.168 12:15:36 compress_isal -- common/autotest_common.sh@952 -- # kill -0 1646029 00:34:23.168 12:15:36 compress_isal -- common/autotest_common.sh@953 -- # uname 00:34:23.168 12:15:36 
compress_isal -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1646029 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1646029' 00:34:23.169 killing process with pid 1646029 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@967 -- # kill 1646029 00:34:23.169 12:15:36 compress_isal -- common/autotest_common.sh@972 -- # wait 1646029 00:34:31.327 12:15:43 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:34:31.327 12:15:43 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:34:31.327 00:34:31.327 real 1m19.996s 00:34:31.327 user 3m0.884s 00:34:31.327 sys 0m4.641s 00:34:31.327 12:15:43 compress_isal -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:31.327 12:15:43 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:34:31.327 ************************************ 00:34:31.327 END TEST compress_isal 00:34:31.327 ************************************ 00:34:31.327 12:15:43 -- common/autotest_common.sh@1142 -- # return 0 00:34:31.327 12:15:43 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:34:31.327 12:15:43 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:34:31.327 12:15:43 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:34:31.327 12:15:43 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:31.327 12:15:43 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:31.327 12:15:43 -- common/autotest_common.sh@10 -- # set +x 00:34:31.327 ************************************ 00:34:31.327 START TEST blockdev_crypto_aesni 00:34:31.327 
************************************ 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:34:31.327 * Looking for test storage... 00:34:31.327 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:34:31.327 12:15:43 blockdev_crypto_aesni -- 
bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1648348 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:34:31.327 12:15:43 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1648348 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@829 -- # '[' -z 1648348 ']' 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:31.327 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:31.327 12:15:43 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:31.327 [2024-07-15 12:15:44.005945] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:34:31.328 [2024-07-15 12:15:44.006022] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648348 ] 00:34:31.328 [2024-07-15 12:15:44.125144] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:31.328 [2024-07-15 12:15:44.232266] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:31.595 12:15:44 blockdev_crypto_aesni -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:31.595 12:15:44 blockdev_crypto_aesni -- common/autotest_common.sh@862 -- # return 0 00:34:31.595 12:15:44 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:34:31.595 12:15:44 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:34:31.595 12:15:44 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:34:31.595 12:15:44 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:31.595 12:15:44 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:31.595 [2024-07-15 12:15:44.994625] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:31.595 [2024-07-15 12:15:45.002657] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:31.595 [2024-07-15 12:15:45.010676] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:31.595 [2024-07-15 12:15:45.077280] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: 
*NOTICE*: Found crypto devices: 97 00:34:34.148 true 00:34:34.148 true 00:34:34.148 true 00:34:34.148 true 00:34:34.148 Malloc0 00:34:34.148 Malloc1 00:34:34.148 Malloc2 00:34:34.148 Malloc3 00:34:34.148 [2024-07-15 12:15:47.454212] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:34.148 crypto_ram 00:34:34.148 [2024-07-15 12:15:47.462229] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:34.148 crypto_ram2 00:34:34.148 [2024-07-15 12:15:47.470248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:34:34.148 crypto_ram3 00:34:34.148 [2024-07-15 12:15:47.478272] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:34:34.148 crypto_ram4 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.148 12:15:47 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@559 -- # xtrace_disable 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8ca3478e-8424-5f0e-9080-13cc5d315298"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ca3478e-8424-5f0e-9080-13cc5d315298",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": 
true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "00c3603b-7cd9-5213-bb6a-d705a2657c53"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "00c3603b-7cd9-5213-bb6a-d705a2657c53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' 
"14e4c5a7-ddc7-5902-9158-96f87e78c12b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "14e4c5a7-ddc7-5902-9158-96f87e78c12b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c9b85c73-7e99-596d-bba1-b1150239e0ac"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c9b85c73-7e99-596d-bba1-b1150239e0ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": 
false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:34:34.148 12:15:47 blockdev_crypto_aesni -- bdev/blockdev.sh@754 -- # killprocess 1648348 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@948 -- # '[' -z 1648348 ']' 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@952 -- # kill -0 1648348 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # uname 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:34.148 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1648348 00:34:34.407 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:34.407 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:34.407 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1648348' 00:34:34.407 killing process with pid 1648348 00:34:34.407 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # kill 1648348 00:34:34.407 12:15:47 blockdev_crypto_aesni -- common/autotest_common.sh@972 -- # wait 1648348 00:34:34.974 12:15:48 
blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:34:34.974 12:15:48 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:34.974 12:15:48 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:34:34.974 12:15:48 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:34.974 12:15:48 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:34.974 ************************************ 00:34:34.974 START TEST bdev_hello_world 00:34:34.974 ************************************ 00:34:34.974 12:15:48 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:34:34.974 [2024-07-15 12:15:48.378625] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:34:34.974 [2024-07-15 12:15:48.378683] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1648915 ] 00:34:34.974 [2024-07-15 12:15:48.505023] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:35.232 [2024-07-15 12:15:48.606117] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:35.232 [2024-07-15 12:15:48.627392] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:35.232 [2024-07-15 12:15:48.635419] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:35.232 [2024-07-15 12:15:48.643438] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:35.232 [2024-07-15 12:15:48.748463] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:37.764 [2024-07-15 12:15:50.976186] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:37.764 [2024-07-15 12:15:50.976263] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:37.764 [2024-07-15 12:15:50.976278] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.764 [2024-07-15 12:15:50.984205] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:37.764 [2024-07-15 12:15:50.984224] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:37.764 [2024-07-15 12:15:50.984236] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.764 [2024-07-15 12:15:50.992225] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:34:37.764 [2024-07-15 12:15:50.992243] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:37.764 [2024-07-15 12:15:50.992254] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.764 [2024-07-15 12:15:51.000246] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:34:37.764 [2024-07-15 12:15:51.000263] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:37.764 [2024-07-15 12:15:51.000279] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:37.764 [2024-07-15 12:15:51.077561] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:34:37.764 [2024-07-15 12:15:51.077605] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:34:37.764 [2024-07-15 12:15:51.077623] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:34:37.764 [2024-07-15 12:15:51.078899] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:34:37.764 [2024-07-15 12:15:51.078968] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:34:37.764 [2024-07-15 12:15:51.078985] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:34:37.764 [2024-07-15 12:15:51.079029] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:34:37.764 00:34:37.764 [2024-07-15 12:15:51.079048] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:34:38.022 00:34:38.022 real 0m3.184s 00:34:38.022 user 0m2.779s 00:34:38.022 sys 0m0.366s 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:34:38.022 ************************************ 00:34:38.022 END TEST bdev_hello_world 00:34:38.022 ************************************ 00:34:38.022 12:15:51 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:38.022 12:15:51 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:34:38.022 12:15:51 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:38.022 12:15:51 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:38.022 12:15:51 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:38.022 ************************************ 00:34:38.022 START TEST bdev_bounds 00:34:38.022 ************************************ 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1649289 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1649289' 00:34:38.022 Process bdevio pid: 1649289 00:34:38.022 12:15:51 
blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1649289 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1649289 ']' 00:34:38.022 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:38.023 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:34:38.023 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:38.023 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:38.023 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:34:38.023 12:15:51 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:38.281 [2024-07-15 12:15:51.645861] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:34:38.281 [2024-07-15 12:15:51.645924] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1649289 ] 00:34:38.281 [2024-07-15 12:15:51.763092] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:34:38.281 [2024-07-15 12:15:51.863027] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:38.281 [2024-07-15 12:15:51.863127] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:34:38.281 [2024-07-15 12:15:51.863128] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:38.540 [2024-07-15 12:15:51.884523] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:34:38.540 [2024-07-15 12:15:51.892542] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:34:38.540 [2024-07-15 12:15:51.900565] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:34:38.540 [2024-07-15 12:15:52.012098] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:34:41.073 [2024-07-15 12:15:54.230311] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:34:41.073 [2024-07-15 12:15:54.230389] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:34:41.073 [2024-07-15 12:15:54.230405] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.073 [2024-07-15 12:15:54.238327] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:34:41.073 [2024-07-15 12:15:54.238349] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:34:41.073 [2024-07-15 
12:15:54.238361] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.073 [2024-07-15 12:15:54.246352] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:34:41.073 [2024-07-15 12:15:54.246370] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:34:41.073 [2024-07-15 12:15:54.246381] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.073 [2024-07-15 12:15:54.254371] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:34:41.073 [2024-07-15 12:15:54.254388] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:34:41.073 [2024-07-15 12:15:54.254399] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:34:41.073 12:15:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:34:41.073 12:15:54 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:34:41.073 12:15:54 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:34:41.073 I/O targets: 00:34:41.073 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:34:41.073 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:34:41.073 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:34:41.073 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:34:41.073 00:34:41.073 00:34:41.073 CUnit - A unit testing framework for C - Version 2.1-3 00:34:41.073 http://cunit.sourceforge.net/ 00:34:41.073 00:34:41.073 00:34:41.073 Suite: bdevio tests on: crypto_ram4 00:34:41.073 Test: blockdev write read block ...passed 00:34:41.073 Test: blockdev write zeroes read block ...passed 00:34:41.073 Test: blockdev write zeroes read no split ...passed 00:34:41.073 Test: blockdev 
write zeroes read split ...passed 00:34:41.073 Test: blockdev write zeroes read split partial ...passed 00:34:41.073 Test: blockdev reset ...passed 00:34:41.073 Test: blockdev write read 8 blocks ...passed 00:34:41.073 Test: blockdev write read size > 128k ...passed 00:34:41.073 Test: blockdev write read invalid size ...passed 00:34:41.073 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:41.073 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:41.073 Test: blockdev write read max offset ...passed 00:34:41.073 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:41.073 Test: blockdev writev readv 8 blocks ...passed 00:34:41.073 Test: blockdev writev readv 30 x 1block ...passed 00:34:41.073 Test: blockdev writev readv block ...passed 00:34:41.073 Test: blockdev writev readv size > 128k ...passed 00:34:41.073 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:41.073 Test: blockdev comparev and writev ...passed 00:34:41.073 Test: blockdev nvme passthru rw ...passed 00:34:41.073 Test: blockdev nvme passthru vendor specific ...passed 00:34:41.073 Test: blockdev nvme admin passthru ...passed 00:34:41.073 Test: blockdev copy ...passed 00:34:41.073 Suite: bdevio tests on: crypto_ram3 00:34:41.073 Test: blockdev write read block ...passed 00:34:41.073 Test: blockdev write zeroes read block ...passed 00:34:41.073 Test: blockdev write zeroes read no split ...passed 00:34:41.073 Test: blockdev write zeroes read split ...passed 00:34:41.073 Test: blockdev write zeroes read split partial ...passed 00:34:41.073 Test: blockdev reset ...passed 00:34:41.073 Test: blockdev write read 8 blocks ...passed 00:34:41.073 Test: blockdev write read size > 128k ...passed 00:34:41.073 Test: blockdev write read invalid size ...passed 00:34:41.073 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:41.073 Test: blockdev write read offset + nbytes > size of blockdev 
...passed 00:34:41.073 Test: blockdev write read max offset ...passed 00:34:41.073 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:41.073 Test: blockdev writev readv 8 blocks ...passed 00:34:41.073 Test: blockdev writev readv 30 x 1block ...passed 00:34:41.073 Test: blockdev writev readv block ...passed 00:34:41.073 Test: blockdev writev readv size > 128k ...passed 00:34:41.073 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:41.073 Test: blockdev comparev and writev ...passed 00:34:41.073 Test: blockdev nvme passthru rw ...passed 00:34:41.073 Test: blockdev nvme passthru vendor specific ...passed 00:34:41.073 Test: blockdev nvme admin passthru ...passed 00:34:41.073 Test: blockdev copy ...passed 00:34:41.073 Suite: bdevio tests on: crypto_ram2 00:34:41.073 Test: blockdev write read block ...passed 00:34:41.073 Test: blockdev write zeroes read block ...passed 00:34:41.073 Test: blockdev write zeroes read no split ...passed 00:34:41.331 Test: blockdev write zeroes read split ...passed 00:34:41.590 Test: blockdev write zeroes read split partial ...passed 00:34:41.590 Test: blockdev reset ...passed 00:34:41.590 Test: blockdev write read 8 blocks ...passed 00:34:41.590 Test: blockdev write read size > 128k ...passed 00:34:41.590 Test: blockdev write read invalid size ...passed 00:34:41.590 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:41.590 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:41.590 Test: blockdev write read max offset ...passed 00:34:41.590 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:41.590 Test: blockdev writev readv 8 blocks ...passed 00:34:41.590 Test: blockdev writev readv 30 x 1block ...passed 00:34:41.590 Test: blockdev writev readv block ...passed 00:34:41.590 Test: blockdev writev readv size > 128k ...passed 00:34:41.590 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:41.590 Test: 
blockdev comparev and writev ...passed 00:34:41.590 Test: blockdev nvme passthru rw ...passed 00:34:41.590 Test: blockdev nvme passthru vendor specific ...passed 00:34:41.590 Test: blockdev nvme admin passthru ...passed 00:34:41.590 Test: blockdev copy ...passed 00:34:41.590 Suite: bdevio tests on: crypto_ram 00:34:41.590 Test: blockdev write read block ...passed 00:34:41.590 Test: blockdev write zeroes read block ...passed 00:34:41.590 Test: blockdev write zeroes read no split ...passed 00:34:41.590 Test: blockdev write zeroes read split ...passed 00:34:41.849 Test: blockdev write zeroes read split partial ...passed 00:34:41.849 Test: blockdev reset ...passed 00:34:41.849 Test: blockdev write read 8 blocks ...passed 00:34:41.849 Test: blockdev write read size > 128k ...passed 00:34:41.849 Test: blockdev write read invalid size ...passed 00:34:41.849 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:34:41.849 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:34:41.849 Test: blockdev write read max offset ...passed 00:34:41.849 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:34:41.849 Test: blockdev writev readv 8 blocks ...passed 00:34:41.849 Test: blockdev writev readv 30 x 1block ...passed 00:34:41.849 Test: blockdev writev readv block ...passed 00:34:41.849 Test: blockdev writev readv size > 128k ...passed 00:34:41.849 Test: blockdev writev readv size > 128k in two iovs ...passed 00:34:41.849 Test: blockdev comparev and writev ...passed 00:34:41.849 Test: blockdev nvme passthru rw ...passed 00:34:41.849 Test: blockdev nvme passthru vendor specific ...passed 00:34:41.849 Test: blockdev nvme admin passthru ...passed 00:34:41.849 Test: blockdev copy ...passed 00:34:41.849 00:34:41.849 Run Summary: Type Total Ran Passed Failed Inactive 00:34:41.849 suites 4 4 n/a 0 0 00:34:41.849 tests 92 92 92 0 0 00:34:41.849 asserts 520 520 520 0 n/a 00:34:41.849 00:34:41.849 Elapsed time = 1.636 
seconds 00:34:41.849 0 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1649289 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1649289 ']' 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1649289 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1649289 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1649289' 00:34:41.849 killing process with pid 1649289 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1649289 00:34:41.849 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1649289 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:34:42.417 00:34:42.417 real 0m4.210s 00:34:42.417 user 0m11.250s 00:34:42.417 sys 0m0.580s 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:34:42.417 ************************************ 00:34:42.417 END TEST bdev_bounds 00:34:42.417 ************************************ 00:34:42.417 12:15:55 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:42.417 12:15:55 
blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:34:42.417 12:15:55 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:34:42.417 12:15:55 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:42.417 12:15:55 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:34:42.417 ************************************ 00:34:42.417 START TEST bdev_nbd 00:34:42.417 ************************************ 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1649839
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1649839 /var/tmp/spdk-nbd.sock
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1649839 ']'
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:34:42.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable
00:34:42.417 12:15:55 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:34:42.417 [2024-07-15 12:15:55.959616] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:34:42.417 [2024-07-15 12:15:55.959699] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:34:42.676 [2024-07-15 12:15:56.090761] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:34:42.676 [2024-07-15 12:15:56.191987] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:34:42.676 [2024-07-15 12:15:56.213274] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:34:42.676 [2024-07-15 12:15:56.221296] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:34:42.676 [2024-07-15 12:15:56.229314] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:34:42.934 [2024-07-15 12:15:56.332582] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:34:45.468 [2024-07-15 12:15:58.559227] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:34:45.468 [2024-07-15 12:15:58.559296] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:34:45.468 [2024-07-15 12:15:58.559311] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.468 [2024-07-15 12:15:58.567245] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:34:45.468 [2024-07-15 12:15:58.567266] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:34:45.468 [2024-07-15 12:15:58.567278] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.468 [2024-07-15 12:15:58.575265] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:34:45.468 [2024-07-15 12:15:58.575283] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:34:45.468 [2024-07-15 12:15:58.575295] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.468 [2024-07-15 12:15:58.583285] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:34:45.468 [2024-07-15 12:15:58.583303] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:34:45.468 [2024-07-15 12:15:58.583314] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@862 -- # return 0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4'
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4'
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:45.468 1+0 records in
00:34:45.468 1+0 records out
00:34:45.468 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300911 s, 13.6 MB/s
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:34:45.468 12:15:58 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:45.727 1+0 records in
00:34:45.727 1+0 records out
00:34:45.727 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00028584 s, 14.3 MB/s
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:34:45.727 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:45.986 1+0 records in
00:34:45.986 1+0 records out
00:34:45.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336088 s, 12.2 MB/s
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:45.986 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:45.987 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:45.987 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:45.987 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:34:45.987 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:46.245 1+0 records in
00:34:46.245 1+0 records out
00:34:46.245 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288459 s, 14.2 MB/s
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:34:46.245 12:15:59 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd0",
00:34:46.503 "bdev_name": "crypto_ram"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd1",
00:34:46.503 "bdev_name": "crypto_ram2"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd2",
00:34:46.503 "bdev_name": "crypto_ram3"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd3",
00:34:46.503 "bdev_name": "crypto_ram4"
00:34:46.503 }
00:34:46.503 ]'
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd0",
00:34:46.503 "bdev_name": "crypto_ram"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd1",
00:34:46.503 "bdev_name": "crypto_ram2"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd2",
00:34:46.503 "bdev_name": "crypto_ram3"
00:34:46.503 },
00:34:46.503 {
00:34:46.503 "nbd_device": "/dev/nbd3",
00:34:46.503 "bdev_name": "crypto_ram4"
00:34:46.503 }
00:34:46.503 ]'
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3'
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3')
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:46.503 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:46.761 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:47.021 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:34:47.279 12:16:00 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:47.539 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:47.797 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:34:47.797 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:34:47.797 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4')
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
00:34:48.055 /dev/nbd0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0
00:34:48.055 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:48.056 1+0 records in
00:34:48.056 1+0 records out
00:34:48.056 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000290302 s, 14.1 MB/s
00:34:48.056 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:34:48.314 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1
00:34:48.314 /dev/nbd1
00:34:48.572 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:34:48.572 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:48.573 1+0 records in
00:34:48.573 1+0 records out
00:34:48.573 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034371 s, 11.9 MB/s
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:34:48.573 12:16:01 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10
00:34:48.573 /dev/nbd10
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd10
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:48.830 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:48.831 1+0 records in
00:34:48.831 1+0 records out
00:34:48.831 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313941 s, 13.0 MB/s
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:34:48.831 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11
00:34:48.831 /dev/nbd11
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local i
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # break
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:34:49.089 1+0 records in
00:34:49.089 1+0 records out
00:34:49.089 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345479 s, 11.9 MB/s
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']'
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # return 0
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:34:49.089 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd0",
00:34:49.348 "bdev_name": "crypto_ram"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd1",
00:34:49.348 "bdev_name": "crypto_ram2"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd10",
00:34:49.348 "bdev_name": "crypto_ram3"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd11",
00:34:49.348 "bdev_name": "crypto_ram4"
00:34:49.348 }
00:34:49.348 ]'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd0",
00:34:49.348 "bdev_name": "crypto_ram"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd1",
00:34:49.348 "bdev_name": "crypto_ram2"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd10",
00:34:49.348 "bdev_name": "crypto_ram3"
00:34:49.348 },
00:34:49.348 {
00:34:49.348 "nbd_device": "/dev/nbd11",
00:34:49.348 "bdev_name": "crypto_ram4"
00:34:49.348 }
00:34:49.348 ]'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:34:49.348 /dev/nbd1
00:34:49.348 /dev/nbd10
00:34:49.348 /dev/nbd11'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:34:49.348 /dev/nbd1
00:34:49.348 /dev/nbd10
00:34:49.348 /dev/nbd11'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256
00:34:49.348 256+0 records in
00:34:49.348 256+0 records out
00:34:49.348 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0106786 s, 98.2 MB/s
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:34:49.348 256+0 records in
00:34:49.348 256+0 records out
00:34:49.348 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0613478 s, 17.1 MB/s
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:34:49.348 256+0 records in
00:34:49.348 256+0 records out
00:34:49.348 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0470314 s, 22.3 MB/s
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:34:49.348 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct
00:34:49.607 256+0 records in
00:34:49.607 256+0 records out
00:34:49.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0420336 s, 24.9 MB/s
00:34:49.607 12:16:02 blockdev_crypto_aesni.bdev_nbd --
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:02 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:34:49.607 256+0 records in 00:34:49.607 256+0 records out 00:34:49.607 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0491639 s, 21.3 MB/s 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:03 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:49.607 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:50.173 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:50.431 12:16:03 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:34:50.690 12:16:04 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:50.690 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:50.948 12:16:04 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:34:51.207 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:34:51.464 malloc_lvol_verify 00:34:51.464 12:16:04 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:34:51.722 6e6cb2c6-2f23-43bd-a389-fcd3e1cc5991 00:34:51.722 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:34:51.980 aabb7ced-5998-4b88-b12d-ae85b5f1be05 00:34:51.980 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:34:52.237 /dev/nbd0 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:34:52.237 mke2fs 1.46.5 (30-Dec-2021) 00:34:52.237 Discarding device blocks: 0/4096 done 00:34:52.237 Creating filesystem with 4096 1k blocks and 1024 inodes 00:34:52.237 00:34:52.237 Allocating group tables: 0/1 done 00:34:52.237 Writing inode tables: 0/1 done 00:34:52.237 Creating journal (1024 blocks): done 00:34:52.237 Writing superblocks and filesystem accounting information: 0/1 done 00:34:52.237 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:34:52.237 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1649839 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1649839 ']' 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1649839 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1649839 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1649839' 00:34:52.496 killing process with pid 1649839 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1649839 00:34:52.496 12:16:05 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1649839 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:34:53.065 00:34:53.065 real 0m10.612s 00:34:53.065 user 0m13.717s 00:34:53.065 sys 0m4.241s 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:34:53.065 ************************************ 00:34:53.065 END TEST bdev_nbd 00:34:53.065 ************************************ 00:34:53.065 12:16:06 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:34:53.065 12:16:06 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:34:53.065 12:16:06 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:34:53.065 12:16:06 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:53.065 12:16:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 
00:34:53.065 ************************************ 00:34:53.065 START TEST bdev_fio 00:34:53.065 ************************************ 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:34:53.065 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 
00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:34:53.065 12:16:06 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:34:53.325 ************************************ 00:34:53.325 START TEST bdev_fio_rw_verify 00:34:53.325 ************************************ 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 
00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:34:53.325 12:16:06 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:34:53.584 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:53.584 
job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:53.584 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:53.584 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:34:53.584 fio-3.35 00:34:53.584 Starting 4 threads 00:35:08.473 00:35:08.473 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1651879: Mon Jul 15 12:16:19 2024 00:35:08.473 read: IOPS=18.3k, BW=71.3MiB/s (74.8MB/s)(713MiB/10001msec) 00:35:08.473 slat (usec): min=10, max=1249, avg=72.31, stdev=52.58 00:35:08.473 clat (usec): min=10, max=1875, avg=388.49, stdev=327.10 00:35:08.473 lat (usec): min=43, max=2042, avg=460.80, stdev=367.28 00:35:08.473 clat percentiles (usec): 00:35:08.473 | 50.000th=[ 260], 99.000th=[ 1418], 99.900th=[ 1516], 99.990th=[ 1565], 00:35:08.473 | 99.999th=[ 1713] 00:35:08.473 write: IOPS=20.2k, BW=79.1MiB/s (82.9MB/s)(770MiB/9729msec); 0 zone resets 00:35:08.473 slat (usec): min=17, max=506, avg=88.83, stdev=56.31 00:35:08.473 clat (usec): min=24, max=2361, avg=482.24, stdev=394.60 00:35:08.473 lat (usec): min=63, max=2596, avg=571.07, stdev=438.14 00:35:08.473 clat percentiles (usec): 00:35:08.473 | 50.000th=[ 338], 99.000th=[ 1778], 99.900th=[ 1942], 99.990th=[ 2245], 00:35:08.473 | 99.999th=[ 2376] 00:35:08.473 bw ( KiB/s): min=64138, max=98136, per=95.71%, avg=77522.63, stdev=2347.26, samples=76 00:35:08.473 iops : min=16034, max=24534, avg=19380.63, stdev=586.83, samples=76 00:35:08.473 lat (usec) : 20=0.01%, 50=1.15%, 100=8.29%, 250=31.80%, 500=27.71% 00:35:08.473 lat (usec) : 750=12.17%, 1000=9.54% 00:35:08.473 lat (msec) : 2=9.30%, 4=0.04% 00:35:08.473 cpu : usr=99.55%, sys=0.00%, ctx=52, majf=0, minf=262 00:35:08.473 IO depths : 1=10.6%, 2=25.4%, 4=50.9%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:08.473 submit : 0=0.0%, 4=100.0%, 8=0.0%, 
16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:08.473 complete : 0=0.0%, 4=88.8%, 8=11.2%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:08.473 issued rwts: total=182617,197008,0,0 short=0,0,0,0 dropped=0,0,0,0 00:35:08.473 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:08.473 00:35:08.473 Run status group 0 (all jobs): 00:35:08.473 READ: bw=71.3MiB/s (74.8MB/s), 71.3MiB/s-71.3MiB/s (74.8MB/s-74.8MB/s), io=713MiB (748MB), run=10001-10001msec 00:35:08.473 WRITE: bw=79.1MiB/s (82.9MB/s), 79.1MiB/s-79.1MiB/s (82.9MB/s-82.9MB/s), io=770MiB (807MB), run=9729-9729msec 00:35:08.473 00:35:08.473 real 0m13.498s 00:35:08.473 user 0m45.849s 00:35:08.473 sys 0m0.475s 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:35:08.473 ************************************ 00:35:08.473 END TEST bdev_fio_rw_verify 00:35:08.473 ************************************ 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:35:08.473 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:35:08.474 12:16:20 
blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8ca3478e-8424-5f0e-9080-13cc5d315298"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ca3478e-8424-5f0e-9080-13cc5d315298",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": 
false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "00c3603b-7cd9-5213-bb6a-d705a2657c53"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "00c3603b-7cd9-5213-bb6a-d705a2657c53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "14e4c5a7-ddc7-5902-9158-96f87e78c12b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "14e4c5a7-ddc7-5902-9158-96f87e78c12b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c9b85c73-7e99-596d-bba1-b1150239e0ac"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c9b85c73-7e99-596d-bba1-b1150239e0ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:35:08.474 crypto_ram2 00:35:08.474 crypto_ram3 00:35:08.474 crypto_ram4 ]] 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "8ca3478e-8424-5f0e-9080-13cc5d315298"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "8ca3478e-8424-5f0e-9080-13cc5d315298",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' 
{' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "00c3603b-7cd9-5213-bb6a-d705a2657c53"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "00c3603b-7cd9-5213-bb6a-d705a2657c53",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "14e4c5a7-ddc7-5902-9158-96f87e78c12b"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "14e4c5a7-ddc7-5902-9158-96f87e78c12b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' 
' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "c9b85c73-7e99-596d-bba1-b1150239e0ac"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "c9b85c73-7e99-596d-bba1-b1150239e0ac",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": 
"crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:35:08.474 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:08.475 ************************************ 00:35:08.475 START TEST bdev_fio_trim 00:35:08.475 ************************************ 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local 
plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:08.475 12:16:20 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:08.475 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:08.475 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:08.475 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:08.475 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:08.475 fio-3.35 00:35:08.475 Starting 4 threads 00:35:20.753 00:35:20.753 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1653727: Mon Jul 15 12:16:33 2024 00:35:20.753 write: IOPS=30.2k, BW=118MiB/s (124MB/s)(1179MiB/10001msec); 0 zone resets 00:35:20.753 slat (usec): min=17, max=469, avg=75.86, stdev=38.75 00:35:20.753 clat (usec): min=55, max=2199, avg=334.42, stdev=199.84 00:35:20.753 lat (usec): min=80, max=2668, avg=410.28, stdev=223.08 00:35:20.753 clat percentiles (usec): 00:35:20.753 | 50.000th=[ 289], 99.000th=[ 1012], 99.900th=[ 1221], 99.990th=[ 1418], 00:35:20.753 | 99.999th=[ 2073] 00:35:20.753 bw ( KiB/s): min=100456, max=185128, per=100.00%, avg=121588.57, stdev=6951.19, samples=77 00:35:20.753 iops : min=25114, max=46282, avg=30397.04, stdev=1737.80, samples=77 00:35:20.753 trim: IOPS=30.2k, BW=118MiB/s (124MB/s)(1179MiB/10001msec); 0 zone resets 00:35:20.753 slat 
(usec): min=5, max=400, avg=21.28, stdev= 9.54 00:35:20.753 clat (usec): min=44, max=1732, avg=315.55, stdev=146.62 00:35:20.753 lat (usec): min=54, max=1746, avg=336.83, stdev=150.50 00:35:20.753 clat percentiles (usec): 00:35:20.753 | 50.000th=[ 293], 99.000th=[ 717], 99.900th=[ 824], 99.990th=[ 979], 00:35:20.753 | 99.999th=[ 1483] 00:35:20.753 bw ( KiB/s): min=100456, max=185152, per=100.00%, avg=121589.81, stdev=6951.70, samples=77 00:35:20.753 iops : min=25114, max=46288, avg=30397.35, stdev=1737.93, samples=77 00:35:20.753 lat (usec) : 50=0.02%, 100=3.26%, 250=36.48%, 500=45.71%, 750=11.83% 00:35:20.753 lat (usec) : 1000=2.15% 00:35:20.753 lat (msec) : 2=0.56%, 4=0.01% 00:35:20.753 cpu : usr=99.58%, sys=0.01%, ctx=80, majf=0, minf=101 00:35:20.753 IO depths : 1=7.3%, 2=26.5%, 4=53.0%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:35:20.753 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:20.753 complete : 0=0.0%, 4=88.3%, 8=11.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:35:20.753 issued rwts: total=0,301870,301871,0 short=0,0,0,0 dropped=0,0,0,0 00:35:20.753 latency : target=0, window=0, percentile=100.00%, depth=8 00:35:20.753 00:35:20.753 Run status group 0 (all jobs): 00:35:20.753 WRITE: bw=118MiB/s (124MB/s), 118MiB/s-118MiB/s (124MB/s-124MB/s), io=1179MiB (1236MB), run=10001-10001msec 00:35:20.753 TRIM: bw=118MiB/s (124MB/s), 118MiB/s-118MiB/s (124MB/s-124MB/s), io=1179MiB (1236MB), run=10001-10001msec 00:35:20.753 00:35:20.753 real 0m13.669s 00:35:20.753 user 0m45.977s 00:35:20.753 sys 0m0.537s 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:35:20.753 ************************************ 00:35:20.753 END TEST bdev_fio_trim 00:35:20.753 ************************************ 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1142 -- # return 0 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:35:20.753 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:35:20.753 00:35:20.753 real 0m27.544s 00:35:20.753 user 1m32.021s 00:35:20.753 sys 0m1.219s 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:20.753 ************************************ 00:35:20.753 END TEST bdev_fio 00:35:20.753 ************************************ 00:35:20.753 12:16:34 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:35:20.753 12:16:34 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:20.753 12:16:34 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:20.753 12:16:34 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:20.753 12:16:34 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:20.753 12:16:34 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:20.753 ************************************ 00:35:20.753 START TEST bdev_verify 00:35:20.753 ************************************ 00:35:20.753 12:16:34 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1123 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:35:20.753 [2024-07-15 12:16:34.279454] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:35:20.753 [2024-07-15 12:16:34.279521] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1655113 ] 00:35:21.012 [2024-07-15 12:16:34.409787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:21.012 [2024-07-15 12:16:34.507954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:21.012 [2024-07-15 12:16:34.507960] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:21.012 [2024-07-15 12:16:34.529318] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:35:21.012 [2024-07-15 12:16:34.537342] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:21.012 [2024-07-15 12:16:34.545360] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:21.271 [2024-07-15 12:16:34.646648] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:35:23.840 [2024-07-15 12:16:36.867262] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:35:23.840 [2024-07-15 12:16:36.867344] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:23.840 [2024-07-15 12:16:36.867358] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:23.840 [2024-07-15 12:16:36.875275] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_aesni_cbc_2" 00:35:23.840 [2024-07-15 12:16:36.875294] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:23.840 [2024-07-15 12:16:36.875306] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:23.840 [2024-07-15 12:16:36.883299] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:35:23.840 [2024-07-15 12:16:36.883318] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:23.840 [2024-07-15 12:16:36.883329] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:23.840 [2024-07-15 12:16:36.891322] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:35:23.840 [2024-07-15 12:16:36.891339] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:23.840 [2024-07-15 12:16:36.891350] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:23.840 Running I/O for 5 seconds... 
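The bdev_fio stage earlier in this log selects trim-capable bdevs by piping the bdev JSON dump through `jq -r 'select(.supported_io_types.unmap == true) | .name'`. A minimal standalone reproduction of that filter follows; the two JSON records here are illustrative stand-ins, not taken from a live SPDK target:

```shell
# Two hypothetical bdev records: one supports unmap, one does not.
# Only the unmap-capable bdev's name should survive the filter,
# mirroring how blockdev.sh builds its fio job list.
printf '%s\n' \
  '{"name":"crypto_ram","supported_io_types":{"unmap":true}}' \
  '{"name":"ro_bdev","supported_io_types":{"unmap":false}}' |
  jq -r 'select(.supported_io_types.unmap == true) | .name'
```

jq evaluates the filter once per JSON object in the stream, so only `crypto_ram` is printed; bdevs that cannot unmap are silently dropped rather than producing empty job sections.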
00:35:29.111 00:35:29.111 Latency(us) 00:35:29.111 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:35:29.111 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:29.111 Verification LBA range: start 0x0 length 0x1000 00:35:29.112 crypto_ram : 5.06 480.40 1.88 0.00 0.00 265892.68 5356.86 160477.72 00:35:29.112 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x1000 length 0x1000 00:35:29.112 crypto_ram : 5.08 388.94 1.52 0.00 0.00 327318.51 3077.34 200597.15 00:35:29.112 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x0 length 0x1000 00:35:29.112 crypto_ram2 : 5.06 480.30 1.88 0.00 0.00 265151.18 5442.34 148624.25 00:35:29.112 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x1000 length 0x1000 00:35:29.112 crypto_ram2 : 5.08 391.92 1.53 0.00 0.00 323966.27 3675.71 182361.04 00:35:29.112 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x0 length 0x1000 00:35:29.112 crypto_ram3 : 5.05 3727.00 14.56 0.00 0.00 34033.06 7351.43 25986.45 00:35:29.112 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x1000 length 0x1000 00:35:29.112 crypto_ram3 : 5.06 3013.25 11.77 0.00 0.00 41996.67 9346.00 30317.52 00:35:29.112 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x0 length 0x1000 00:35:29.112 crypto_ram4 : 5.05 3734.01 14.59 0.00 0.00 33887.93 1674.02 25530.55 00:35:29.112 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:35:29.112 Verification LBA range: start 0x1000 length 0x1000 00:35:29.112 crypto_ram4 : 5.07 3031.93 11.84 0.00 0.00 41679.20 2222.53 
30089.57 00:35:29.112 =================================================================================================================== 00:35:29.112 Total : 15247.75 59.56 0.00 0.00 66693.32 1674.02 200597.15 00:35:29.112 00:35:29.112 real 0m8.270s 00:35:29.112 user 0m15.676s 00:35:29.112 sys 0m0.367s 00:35:29.112 12:16:42 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:29.112 12:16:42 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:35:29.112 ************************************ 00:35:29.112 END TEST bdev_verify 00:35:29.112 ************************************ 00:35:29.112 12:16:42 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0 00:35:29.112 12:16:42 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:29.112 12:16:42 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:35:29.112 12:16:42 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:29.112 12:16:42 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:29.112 ************************************ 00:35:29.112 START TEST bdev_verify_big_io 00:35:29.112 ************************************ 00:35:29.112 12:16:42 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:35:29.112 [2024-07-15 12:16:42.641192] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:35:29.112 [2024-07-15 12:16:42.641262] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1656139 ] 00:35:29.370 [2024-07-15 12:16:42.773022] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:35:29.370 [2024-07-15 12:16:42.878470] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:29.370 [2024-07-15 12:16:42.878474] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:29.370 [2024-07-15 12:16:42.899873] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:35:29.370 [2024-07-15 12:16:42.907899] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:29.370 [2024-07-15 12:16:42.915919] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:29.628 [2024-07-15 12:16:43.020195] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:35:32.161 [2024-07-15 12:16:45.258925] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:35:32.161 [2024-07-15 12:16:45.259016] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:32.161 [2024-07-15 12:16:45.259036] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:32.161 [2024-07-15 12:16:45.266940] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:35:32.161 [2024-07-15 12:16:45.266963] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:32.161 [2024-07-15 12:16:45.266976] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:35:32.161 [2024-07-15 12:16:45.274962] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:35:32.162 [2024-07-15 12:16:45.274980] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:32.162 [2024-07-15 12:16:45.274992] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:32.162 [2024-07-15 12:16:45.282983] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:35:32.162 [2024-07-15 12:16:45.283000] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:32.162 [2024-07-15 12:16:45.283012] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:32.162 Running I/O for 5 seconds... 00:35:32.754 [2024-07-15 12:16:46.222545] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:32.754 [2024-07-15 12:16:46.223492] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:32.754 [2024-07-15 12:16:46.223591] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:32.754 [2024-07-15 12:16:46.223652] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:35:32.754 [2024-07-15 12:16:46.225301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.754 [2024-07-15 12:16:46.225373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.754 [2024-07-15 12:16:46.225428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:32.756 [2024-07-15 12:16:46.298899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.298951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.299007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.300424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.300485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.300537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.300588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.301203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.301262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.301332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.301383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.302915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.304713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:32.756 [2024-07-15 12:16:46.306042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.307790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.308311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.310097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.310594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.311525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.314379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.316126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.317900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.319674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.320786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.322537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.324321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:32.756 [2024-07-15 12:16:46.326100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.329289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.330075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.330560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.332353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.334574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.335904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.337623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.339406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.342500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.344319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.346125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:32.756 [2024-07-15 12:16:46.347716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.014 [2024-07-15 12:16:46.349952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.014 [2024-07-15 12:16:46.351712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.352971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.353460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.356681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.358224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.359979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.361781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.362714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.363727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.365471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.367236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.370506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.372302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.372809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.373520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.375803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.377591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.378921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.380661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.382691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.384492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.386256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.388033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.390243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.392026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.393804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.394472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.397804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.399382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.401146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.402928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.404725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.405224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.406859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.408661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.411835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.413640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.415211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.415705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.417903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.419676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.421481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.423157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.425115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.426008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.427755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.428676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.430589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.432340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.434117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.435904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.439088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.440899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.442717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.444417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.446674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.448483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.448980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.449471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.451487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.451999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.452499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.453033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.454202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.454712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.455205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.455704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.458187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.458701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.459194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.459689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.460822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.461323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.461825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.462312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.464668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.465178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.465690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.466183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.467200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.467711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.468216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.468716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.470840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.471342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.471841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.472344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.473475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.473985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.474482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.474974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.477209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.477722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.478219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.478718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.479800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.480299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.480796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.481290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.483735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.484238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.484734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.485228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.486375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.486888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.487406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.487902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.490356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.490867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.491359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.491857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.492878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.493378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.493878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.494386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.496476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.496990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.497489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.498009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.499109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.499609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.500107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.500596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.503172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.503682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.504183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.504677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.505745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.015 [2024-07-15 12:16:46.507425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.015 [2024-07-15 12:16:46.509049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:33.280 [... identical "*ERROR*: Failed to get src_mbufs!" message from accel_dpdk_cryptodev.c:468 repeated continuously from 12:16:46.509 through 12:16:46.726; duplicate entries omitted ...]
00:35:33.280 [2024-07-15 12:16:46.726282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.726344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.726675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.726863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.726919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.726984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.727037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.728474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.728537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.728588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.728647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.729181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.729363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.280 [2024-07-15 12:16:46.729424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.729477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.729528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.731982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.733511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.280 [2024-07-15 12:16:46.733582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.733635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.733695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.734216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.734391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.734446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.734498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.734549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.280 [2024-07-15 12:16:46.736872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.736984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.737035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.738525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.738588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.738640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.280 [2024-07-15 12:16:46.738713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.739194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.739366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.739425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.739479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.739530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.740966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.741932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.743460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.743523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.743575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.743627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.744066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.744244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.744299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.744351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.744403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.745826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.745889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.745945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.746001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.746406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.746581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.746637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.746696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.746754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.748976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.749035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.749098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.749158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.750673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.750744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.750795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.750851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.751184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.751363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.751419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.751490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.751541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.753960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.754013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.754068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.755507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.755570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.756732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.758822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.759331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.759834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.760326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.760815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.760995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.761503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.762003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.762497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.764652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.765165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.765658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.766168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.766734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.767336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.767851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.768345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.768842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.771026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.771539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.772044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.772536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.773050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.773659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.774158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.774652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.775154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.777482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.777991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.778503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.779008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.779490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.780108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.780609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.781117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.781604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.783998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.784503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.784999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.785494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.785960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.786574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.281 [2024-07-15 12:16:46.787084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.787581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.788079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.790430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.790943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.791440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.791936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.792325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.792950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.793447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.793946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.281 [2024-07-15 12:16:46.794455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.282 [2024-07-15 12:16:46.796711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.282 [2024-07-15 12:16:46.797213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 (last message repeated ~270 times, 12:16:46.797213 through 12:16:47.149827)
00:35:33.804 [2024-07-15 12:16:47.150335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.150519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.150585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.150649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.150723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.152431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.152493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.152557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.152613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.153111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.153292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.153361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.153425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.804 [2024-07-15 12:16:47.153481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.155499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.155573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.155638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.155714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.156262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.156443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.156498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.156573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.156638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.158523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.158586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.158640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.804 [2024-07-15 12:16:47.158699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.159279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.159475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.159534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.159599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.159667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.161396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.161459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.161523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.161578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.162082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.162262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.162338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.804 [2024-07-15 12:16:47.162402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.162458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.164566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.164645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.164715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.804 [2024-07-15 12:16:47.164784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.165277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.165455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.165521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.165585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.165637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.167508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.167570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.167625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.167677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.168220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.168404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.168463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.168526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.168597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.170312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.170375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.170440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.170502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.170980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.171162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.171242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.171308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.171360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.173470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.173552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.173615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.173690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.174196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.174377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.174442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.174508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.174561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.176376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.176441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.176493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.176549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.177122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.177302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.177372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.177425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.177499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.179207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.179277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.179341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.179413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.179835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.180018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.180086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.180139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.180191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.182982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.183038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.183105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.183170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.185043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.185108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.185162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.185216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.185783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.185970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.186043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.186110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.186172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.187888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.187950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.188553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.188959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.190828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.190903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.190974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.191027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.191507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.191701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.191758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.191832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.191898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.193761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.193824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.193877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.193931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.194483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.194665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.194740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.194806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.194859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.196595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.196658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.196717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.196781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.197260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.197442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.197514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.197590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.197654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.199476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.199555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.199628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.199701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.200166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.200347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.200402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.805 [2024-07-15 12:16:47.200467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.200533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.202379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.202442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.202494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.202549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.203086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.203269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.203339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.203395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.203458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.205183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:33.805 [2024-07-15 12:16:47.205246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:33.806 [2024-07-15 12:16:47.205297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:34.070 [2024-07-15 12:16:47.525256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.527065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.528680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.530244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.532456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.532971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.533470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.533969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.534371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.534992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.535492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.535996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.536504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.070 [2024-07-15 12:16:47.538662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.070 [2024-07-15 12:16:47.539172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.539668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.540174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.540613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.541235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.541792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.542295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.542796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.545009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.545518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.546021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.546515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.546933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.547552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.548064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.548559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.549059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.551117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.551625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.552130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.552626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.553093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.553713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.554216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.554718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.555209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.557451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.557968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.559122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.560002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.560351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.560972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.561474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.562913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.563514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.566578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.567173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.568648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.569154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.569590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.571515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.572432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.573548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.574046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.577078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.577582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.578083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.579866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.580300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.581379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.581887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.582439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.583923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.586085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.587597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.589146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.589647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.590127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.591035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.592246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.594059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.594552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.598086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.598591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.599090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.600368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.600868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.602799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.603303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.603807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.605363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.607579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.609069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.609675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.609740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.610084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.610702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.611202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.613019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.614179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.616446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.616519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.616571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.616623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.616971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.618471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.618538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.618591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.618648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.620334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.620396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.620466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.620535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.620878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.621059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.621114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.621166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.621217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.622801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.622863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.622913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.622967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.623474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.623661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.623736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.623789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.623845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.625527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.625601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.625653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.625711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.626123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.626309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.626366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.626417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.626481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.628080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.628959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.630593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.630655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.630714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.630774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.631112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.631291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.631347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.631398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.631448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.633958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.634012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.071 [2024-07-15 12:16:47.634072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.635550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.635613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.071 [2024-07-15 12:16:47.635664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.635723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.636062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.636243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.636300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.636352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.636404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.638338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.638412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.072 [2024-07-15 12:16:47.638464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.072 [2024-07-15 12:16:47.638522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:35:34.336 [2024-07-15 12:16:47.788102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [identical error repeated continuously between the two timestamps above; duplicate lines collapsed]
00:35:34.336 [2024-07-15 12:16:47.790099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.790980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.792716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.794818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.796618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.798537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.800535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.800994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.802868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.804864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.806854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.807349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.810926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.812055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.813797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.815791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.816133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.816948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.817443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.819241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.821117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.824575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.826561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.827221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.827716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.828060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.829976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.831968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.833077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.834864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.836905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.838482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.840285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.841937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.842292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.844148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.845950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.846456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.846969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.850325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.852138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.853900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.855441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.855851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.857761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.858263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.858769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.860505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.863969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.865213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.865731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.866318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.866660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.867767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.868757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.869260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.869756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.872293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.872808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.873303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.873805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.874313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.874930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.875427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.875928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.876418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.878908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.879410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.879912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.880404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.880830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.881439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.881942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.882435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.882944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.885144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.885641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.886140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.336 [2024-07-15 12:16:47.886639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.887148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.887775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.888276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.888782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.889272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.891751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.892256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.892754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.893246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.893715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.894332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.336 [2024-07-15 12:16:47.894834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.337 [2024-07-15 12:16:47.895333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.895853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.898024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.898526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.899033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.899528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.900044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.900660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.901179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.901672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.902170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.904706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.905214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.337 [2024-07-15 12:16:47.905709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.906201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.906774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.907383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.907891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.908387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.908881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.911318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.911830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.912329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.912831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.913220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.913834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.337 [2024-07-15 12:16:47.914332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.914832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.915327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.917721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.918224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.919415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.921378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.921889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.923990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.925118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.925606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.337 [2024-07-15 12:16:47.926955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.930355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.637 [2024-07-15 12:16:47.932353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.932865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.933358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.933736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.934350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.936024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.936515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.937019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.940496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.941011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.941503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.943318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.943661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.637 [2024-07-15 12:16:47.945495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.947298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.947800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.948288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.951330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.953333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.954626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.955127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.955530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.957374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.959367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.960820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.962614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.637 [2024-07-15 12:16:47.964553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.965896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.967633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.969549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.969905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.971811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.973729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.975693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.976622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.980125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.981619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.983426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.985323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.637 [2024-07-15 12:16:47.985668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.986871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.987369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.988848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.990589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.993883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.995880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.996818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.997309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.997653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:47.999498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:48.001533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.637 [2024-07-15 12:16:48.002816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.639 [2024-07-15 12:16:48.161883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.161939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.165916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.165980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.166848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.170701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.170768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.639 [2024-07-15 12:16:48.170820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.170872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.171266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.171454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.171509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.171567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.171621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.175415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.175480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.175533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.175588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.175967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.176150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.639 [2024-07-15 12:16:48.176205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.176257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.176308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.179997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.180861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.184991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.639 [2024-07-15 12:16:48.185057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.185880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.188574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.639 [2024-07-15 12:16:48.188639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.188698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.188750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.189167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.640 [2024-07-15 12:16:48.189347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.189403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.189455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.189506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.192355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.192420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.192474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.192526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.192957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.193141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.193198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.193251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.193302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.640 [2024-07-15 12:16:48.194301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.194367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.194419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.194923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.195411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.195595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.195652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.195713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.195784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.197758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.198262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.198764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.199280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:35:34.640 [2024-07-15 12:16:48.199801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.199982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.200478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.200988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.201483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.203650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.204163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.204691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.205191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.205716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.206331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.206842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:35:34.640 [2024-07-15 12:16:48.207335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
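The src_mbufs failures above indicate that the DPDK mbuf pool backing the crypto module was temporarily exhausted under the queue depth 128 workload. As a rough analogy (plain Python, not DPDK's actual API), bulk allocation from a fixed-size pool fails outright whenever fewer objects remain free than were requested, which is why the error repeats until completions return buffers to the pool:

```python
class FixedPool:
    """Toy stand-in for a DPDK mempool: bulk allocation is all-or-nothing."""

    def __init__(self, size: int):
        self.free = [object() for _ in range(size)]

    def alloc_bulk(self, n: int):
        # Like rte_pktmbuf_alloc_bulk: fail (return None) unless n objects are free.
        if len(self.free) < n:
            return None
        taken, self.free = self.free[:n], self.free[n:]
        return taken

    def free_bulk(self, objs):
        # Completions hand buffers back, after which allocation succeeds again.
        self.free.extend(objs)


pool = FixedPool(size=4)
batch = pool.alloc_bulk(3)          # succeeds, 1 buffer left
assert pool.alloc_bulk(2) is None   # pool exhausted: the "Failed to get src_mbufs!" case
pool.free_bulk(batch)               # returning buffers clears the condition
assert pool.alloc_bulk(2) is not None
```

In the real module the failed task is retried on a later poll, so the run still completes; the errors are noise from transient pool pressure rather than a fatal fault.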
00:35:37.922 00:35:37.922 Latency(us)
00:35:37.922 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:37.922 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x0 length 0x100
00:35:37.922 crypto_ram : 5.72 44.73 2.80 0.00 0.00 2774842.32 69753.10 2494699.07
00:35:37.922 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x100 length 0x100
00:35:37.922 crypto_ram : 6.02 41.86 2.62 0.00 0.00 2954473.24 144065.22 3063665.53
00:35:37.922 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x0 length 0x100
00:35:37.922 crypto_ram2 : 5.72 44.72 2.80 0.00 0.00 2681072.42 69297.20 2494699.07
00:35:37.922 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x100 length 0x100
00:35:37.922 crypto_ram2 : 6.02 42.50 2.66 0.00 0.00 2802987.19 65194.07 3063665.53
00:35:37.922 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x0 length 0x100
00:35:37.922 crypto_ram3 : 5.57 304.47 19.03 0.00 0.00 377551.58 56987.83 645558.09
00:35:37.922 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x100 length 0x100
00:35:37.922 crypto_ram3 : 5.67 234.00 14.63 0.00 0.00 479729.36 18350.08 601791.44
00:35:37.922 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:35:37.922 Verification LBA range: start 0x0 length 0x100
00:35:37.922 crypto_ram4 : 5.64 319.17 19.95 0.00 0.00 350893.04 18008.15 485080.38
00:35:37.923 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:35:37.923 Verification LBA range: start 0x100 length 0x100
00:35:37.923 crypto_ram4 : 5.80 252.48 15.78 0.00 0.00 434158.95 15272.74 594497.00
00:35:37.923 ===================================================================================================================
00:35:37.923 Total : 1283.94 80.25 0.00 0.00 739506.28 15272.74 3063665.53
00:35:38.491 00:35:38.491 real 0m9.254s
00:35:38.491 user 0m17.530s
00:35:38.491 sys 0m0.465s
12:16:51 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:38.491 12:16:51 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:35:38.491 ************************************
00:35:38.491 END TEST bdev_verify_big_io
00:35:38.491 ************************************
00:35:38.491 12:16:51 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:35:38.491 12:16:51 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:38.491 12:16:51 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:38.491 12:16:51 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable
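The MiB/s column in the bdevperf latency tables above follows directly from the IOPS column and the job's IO size (IOPS x IO size / 2^20). A small sketch of that arithmetic, using rows from the tables as a sanity check:

```python
def mib_per_s(iops: float, io_size_bytes: int) -> float:
    """Throughput in MiB/s implied by an IOPS figure at a fixed IO size."""
    return round(iops * io_size_bytes / 2**20, 2)

# Rows from the verify table (IO size: 65536 bytes = 64 KiB, so MiB/s = IOPS / 16)
print(mib_per_s(44.73, 65536))    # crypto_ram,  core mask 0x1 -> 2.8
print(mib_per_s(304.47, 65536))   # crypto_ram3, core mask 0x1 -> 19.03

# Row from the later write_zeroes table (IO size: 4096 bytes)
print(mib_per_s(1974.17, 4096))   # crypto_ram -> 7.71
```

The tiny crypto_ram/crypto_ram2 figures (~2.8 MiB/s at 64 KiB IOs) against crypto_ram3/crypto_ram4 (~19-20 MiB/s) reflect the per-device split of the AES-NI workload, not a measurement error; the Total row aggregates all eight jobs.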
00:35:38.491 12:16:51 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:38.491 ************************************ 00:35:38.491 START TEST bdev_write_zeroes 00:35:38.491 ************************************ 00:35:38.491 12:16:51 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:38.491 [2024-07-15 12:16:51.971079] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:35:38.491 [2024-07-15 12:16:51.971129] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657278 ] 00:35:38.491 [2024-07-15 12:16:52.082540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:38.750 [2024-07-15 12:16:52.187400] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:38.750 [2024-07-15 12:16:52.208699] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:35:38.750 [2024-07-15 12:16:52.216726] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:35:38.750 [2024-07-15 12:16:52.224739] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:35:38.750 [2024-07-15 12:16:52.340397] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:35:41.322 [2024-07-15 12:16:54.564929] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:35:41.322 [2024-07-15 12:16:54.565010] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:41.322 
[2024-07-15 12:16:54.565026] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:41.322 [2024-07-15 12:16:54.572948] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:35:41.322 [2024-07-15 12:16:54.572968] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:41.322 [2024-07-15 12:16:54.572980] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:41.322 [2024-07-15 12:16:54.580968] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:35:41.322 [2024-07-15 12:16:54.580991] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:35:41.322 [2024-07-15 12:16:54.581002] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:41.322 [2024-07-15 12:16:54.588989] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:35:41.322 [2024-07-15 12:16:54.589006] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:35:41.322 [2024-07-15 12:16:54.589018] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:41.322 Running I/O for 1 seconds... 
00:35:42.255 00:35:42.255 Latency(us)
00:35:42.255 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:35:42.255 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:42.255 crypto_ram : 1.02 1974.17 7.71 0.00 0.00 64359.70 5442.34 77503.44
00:35:42.255 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:42.255 crypto_ram2 : 1.03 1979.89 7.73 0.00 0.00 63813.04 5413.84 72032.61
00:35:42.255 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:42.255 crypto_ram3 : 1.02 15126.09 59.09 0.00 0.00 8327.13 2464.72 10770.70
00:35:42.255 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:35:42.255 crypto_ram4 : 1.02 15163.28 59.23 0.00 0.00 8281.05 2464.72 8719.14
00:35:42.255 ===================================================================================================================
00:35:42.255 Total : 34243.43 133.76 0.00 0.00 14774.04 2464.72 77503.44
00:35:42.513 00:35:42.513 real 0m4.175s
00:35:42.513 user 0m3.766s
00:35:42.513 sys 0m0.362s
12:16:56 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:35:42.513 12:16:56 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:35:42.513 ************************************
00:35:42.513 END TEST bdev_write_zeroes
00:35:42.513 ************************************
00:35:42.772 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 0
00:35:42.772 12:16:56 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:35:42.772 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:35:42.772
12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:42.772 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:42.772 ************************************ 00:35:42.772 START TEST bdev_json_nonenclosed 00:35:42.772 ************************************ 00:35:42.772 12:16:56 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:42.772 [2024-07-15 12:16:56.286227] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:35:42.772 [2024-07-15 12:16:56.286354] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657818 ] 00:35:43.031 [2024-07-15 12:16:56.480194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:43.031 [2024-07-15 12:16:56.583939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:43.031 [2024-07-15 12:16:56.584006] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:35:43.031 [2024-07-15 12:16:56.584027] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:43.031 [2024-07-15 12:16:56.584042] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:43.290 00:35:43.290 real 0m0.508s 00:35:43.290 user 0m0.288s 00:35:43.290 sys 0m0.215s 00:35:43.290 12:16:56 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:35:43.290 12:16:56 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:43.290 12:16:56 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:35:43.290 ************************************ 00:35:43.290 END TEST bdev_json_nonenclosed 00:35:43.290 ************************************ 00:35:43.290 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:35:43.290 12:16:56 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # true 00:35:43.290 12:16:56 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:35:43.290 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:35:43.290 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:43.290 12:16:56 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:43.290 ************************************ 00:35:43.290 START TEST bdev_json_nonarray 00:35:43.290 ************************************ 00:35:43.290 12:16:56 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
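The two negative tests above (bdev_json_nonenclosed and bdev_json_nonarray) deliberately feed bdevperf configs that fail the json_config_prepare_ctx checks: one config is not enclosed in {}, and in the other 'subsystems' is not an array. A minimal sketch of that validation logic in plain Python (the function name and return convention here are illustrative, not SPDK's actual API):

```python
import json
from typing import Optional


def validate_spdk_config(text: str) -> Optional[str]:
    """Return an error string mirroring the checks seen in the log, or None if ok."""
    try:
        cfg = json.loads(text)
    except json.JSONDecodeError:
        return "Invalid JSON configuration: not parseable."
    # json_config.c: 608 case: top level must be a JSON object
    if not isinstance(cfg, dict):
        return "Invalid JSON configuration: not enclosed in {}."
    # json_config.c: 614 case: 'subsystems' must be an array
    if not isinstance(cfg.get("subsystems"), list):
        return "Invalid JSON configuration: 'subsystems' should be an array."
    return None


print(validate_spdk_config('[1, 2]'))              # not enclosed in {}
print(validate_spdk_config('{"subsystems": {}}'))  # subsystems not an array
print(validate_spdk_config('{"subsystems": []}'))  # None: minimal valid shape
```

In both tests the app then stops with a non-zero exit (spdk_app_stop'd on non-zero, es=234), which the harness treats as the expected outcome, hence the `true` after each run.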
00:35:43.290 [2024-07-15 12:16:56.839646] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:35:43.290 [2024-07-15 12:16:56.839712] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1657989 ] 00:35:43.549 [2024-07-15 12:16:56.969009] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:43.549 [2024-07-15 12:16:57.069228] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:43.549 [2024-07-15 12:16:57.069302] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:35:43.549 [2024-07-15 12:16:57.069323] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:35:43.549 [2024-07-15 12:16:57.069335] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:35:43.807 00:35:43.807 real 0m0.399s 00:35:43.807 user 0m0.234s 00:35:43.807 sys 0m0.162s 00:35:43.807 12:16:57 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:35:43.807 12:16:57 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:43.807 12:16:57 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:35:43.807 ************************************ 00:35:43.807 END TEST bdev_json_nonarray 00:35:43.807 ************************************ 00:35:43.807 12:16:57 blockdev_crypto_aesni -- common/autotest_common.sh@1142 -- # return 234 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # true 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni 
-- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]] 00:35:43.807 12:16:57 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]] 00:35:43.807 00:35:43.807 real 1m13.429s 00:35:43.807 user 2m42.048s 00:35:43.807 sys 0m9.322s 00:35:43.807 12:16:57 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:43.807 12:16:57 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:35:43.807 ************************************ 00:35:43.807 END TEST blockdev_crypto_aesni 00:35:43.807 ************************************ 00:35:43.807 12:16:57 -- common/autotest_common.sh@1142 -- # return 0 00:35:43.807 12:16:57 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:35:43.807 12:16:57 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:43.807 12:16:57 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:43.807 12:16:57 -- common/autotest_common.sh@10 -- # set +x 00:35:43.807 ************************************ 00:35:43.807 START TEST blockdev_crypto_sw 00:35:43.807 ************************************ 00:35:43.807 12:16:57 blockdev_crypto_sw -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:35:44.065 * Looking for test storage... 00:35:44.065 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 
00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1658079 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:35:44.065 12:16:57 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1658079 00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@829 -- # '[' -z 1658079 ']' 00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:44.065 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:44.065 12:16:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:44.065 [2024-07-15 12:16:57.515374] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:35:44.065 [2024-07-15 12:16:57.515428] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658079 ] 00:35:44.065 [2024-07-15 12:16:57.620967] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:44.325 [2024-07-15 12:16:57.724317] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:44.890 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:44.890 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@862 -- # return 0 00:35:44.890 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:35:44.890 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:35:44.890 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:35:44.890 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:44.890 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.148 Malloc0 00:35:45.148 Malloc1 00:35:45.148 true 00:35:45.148 true 00:35:45.148 true 00:35:45.148 [2024-07-15 12:16:58.729313] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:45.148 crypto_ram 00:35:45.148 [2024-07-15 12:16:58.737343] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:45.148 crypto_ram2 00:35:45.407 [2024-07-15 12:16:58.745364] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:45.407 crypto_ram3 00:35:45.407 [ 00:35:45.407 { 
00:35:45.407 "name": "Malloc1", 00:35:45.407 "aliases": [ 00:35:45.407 "9253c8fb-762a-4771-a4d6-a3c9db2c7e05" 00:35:45.407 ], 00:35:45.407 "product_name": "Malloc disk", 00:35:45.407 "block_size": 4096, 00:35:45.407 "num_blocks": 4096, 00:35:45.407 "uuid": "9253c8fb-762a-4771-a4d6-a3c9db2c7e05", 00:35:45.407 "assigned_rate_limits": { 00:35:45.407 "rw_ios_per_sec": 0, 00:35:45.407 "rw_mbytes_per_sec": 0, 00:35:45.407 "r_mbytes_per_sec": 0, 00:35:45.407 "w_mbytes_per_sec": 0 00:35:45.407 }, 00:35:45.407 "claimed": true, 00:35:45.407 "claim_type": "exclusive_write", 00:35:45.407 "zoned": false, 00:35:45.407 "supported_io_types": { 00:35:45.407 "read": true, 00:35:45.407 "write": true, 00:35:45.407 "unmap": true, 00:35:45.407 "flush": true, 00:35:45.407 "reset": true, 00:35:45.407 "nvme_admin": false, 00:35:45.407 "nvme_io": false, 00:35:45.407 "nvme_io_md": false, 00:35:45.407 "write_zeroes": true, 00:35:45.407 "zcopy": true, 00:35:45.407 "get_zone_info": false, 00:35:45.407 "zone_management": false, 00:35:45.407 "zone_append": false, 00:35:45.407 "compare": false, 00:35:45.407 "compare_and_write": false, 00:35:45.407 "abort": true, 00:35:45.407 "seek_hole": false, 00:35:45.407 "seek_data": false, 00:35:45.407 "copy": true, 00:35:45.407 "nvme_iov_md": false 00:35:45.407 }, 00:35:45.407 "memory_domains": [ 00:35:45.407 { 00:35:45.407 "dma_device_id": "system", 00:35:45.407 "dma_device_type": 1 00:35:45.407 }, 00:35:45.407 { 00:35:45.407 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:35:45.407 "dma_device_type": 2 00:35:45.407 } 00:35:45.407 ], 00:35:45.407 "driver_specific": {} 00:35:45.407 } 00:35:45.407 ] 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 
00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@559 -- # xtrace_disable 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.407 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:35:45.407 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:35:45.408 12:16:58 blockdev_crypto_sw 
-- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e670f7db-2151-5b89-af25-b0d901e305f5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "e670f7db-2151-5b89-af25-b0d901e305f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' 
"zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:35:45.408 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:35:45.408 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:35:45.408 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:35:45.408 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:35:45.408 12:16:58 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 1658079 00:35:45.408 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@948 -- # '[' -z 1658079 ']' 00:35:45.408 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@952 -- # kill -0 1658079 00:35:45.408 12:16:58 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # uname 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1658079 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1658079' 00:35:45.666 killing process with pid 1658079 
00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # kill 1658079 00:35:45.666 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@972 -- # wait 1658079 00:35:45.923 12:16:59 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:35:45.923 12:16:59 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:45.923 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:35:45.923 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:45.923 12:16:59 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:45.923 ************************************ 00:35:45.923 START TEST bdev_hello_world 00:35:45.923 ************************************ 00:35:45.923 12:16:59 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:35:46.182 [2024-07-15 12:16:59.540633] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:35:46.182 [2024-07-15 12:16:59.540707] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658367 ] 00:35:46.182 [2024-07-15 12:16:59.667787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:46.182 [2024-07-15 12:16:59.768675] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:46.440 [2024-07-15 12:16:59.948071] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:46.440 [2024-07-15 12:16:59.948141] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:46.440 [2024-07-15 12:16:59.948156] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.440 [2024-07-15 12:16:59.956089] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:46.440 [2024-07-15 12:16:59.956108] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:46.440 [2024-07-15 12:16:59.956120] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.440 [2024-07-15 12:16:59.964110] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:46.440 [2024-07-15 12:16:59.964128] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:46.440 [2024-07-15 12:16:59.964140] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:46.440 [2024-07-15 12:17:00.005816] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:35:46.440 [2024-07-15 12:17:00.005853] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:35:46.440 [2024-07-15 12:17:00.005872] hello_bdev.c: 
244:hello_start: *NOTICE*: Opening io channel 00:35:46.440 [2024-07-15 12:17:00.007793] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:35:46.440 [2024-07-15 12:17:00.007871] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:35:46.440 [2024-07-15 12:17:00.007887] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:35:46.440 [2024-07-15 12:17:00.007926] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:35:46.440 00:35:46.440 [2024-07-15 12:17:00.007944] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:35:46.698 00:35:46.698 real 0m0.751s 00:35:46.698 user 0m0.491s 00:35:46.698 sys 0m0.244s 00:35:46.698 12:17:00 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:46.698 12:17:00 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:35:46.698 ************************************ 00:35:46.698 END TEST bdev_hello_world 00:35:46.698 ************************************ 00:35:46.698 12:17:00 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:46.698 12:17:00 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:35:46.698 12:17:00 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:46.698 12:17:00 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:46.698 12:17:00 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:46.974 ************************************ 00:35:46.974 START TEST bdev_bounds 00:35:46.974 ************************************ 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1658467 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT 
SIGTERM EXIT 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1658467' 00:35:46.974 Process bdevio pid: 1658467 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 1658467 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1658467 ']' 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:35:46.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:46.974 12:17:00 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:46.974 [2024-07-15 12:17:00.375143] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:35:46.974 [2024-07-15 12:17:00.375209] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1658467 ] 00:35:46.974 [2024-07-15 12:17:00.503891] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:35:47.231 [2024-07-15 12:17:00.613701] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:35:47.231 [2024-07-15 12:17:00.613733] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:35:47.231 [2024-07-15 12:17:00.613734] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:47.231 [2024-07-15 12:17:00.791970] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:47.231 [2024-07-15 12:17:00.792039] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:47.231 [2024-07-15 12:17:00.792053] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:47.231 [2024-07-15 12:17:00.799988] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:47.231 [2024-07-15 12:17:00.800007] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:47.231 [2024-07-15 12:17:00.800018] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:47.231 [2024-07-15 12:17:00.808012] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:47.231 [2024-07-15 12:17:00.808036] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:47.231 [2024-07-15 12:17:00.808047] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:47.809 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@858 
-- # (( i == 0 )) 00:35:47.809 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:35:47.809 12:17:01 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:35:48.130 I/O targets: 00:35:48.130 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:35:48.130 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:35:48.130 00:35:48.130 00:35:48.130 CUnit - A unit testing framework for C - Version 2.1-3 00:35:48.130 http://cunit.sourceforge.net/ 00:35:48.130 00:35:48.130 00:35:48.130 Suite: bdevio tests on: crypto_ram3 00:35:48.130 Test: blockdev write read block ...passed 00:35:48.130 Test: blockdev write zeroes read block ...passed 00:35:48.131 Test: blockdev write zeroes read no split ...passed 00:35:48.131 Test: blockdev write zeroes read split ...passed 00:35:48.131 Test: blockdev write zeroes read split partial ...passed 00:35:48.131 Test: blockdev reset ...passed 00:35:48.131 Test: blockdev write read 8 blocks ...passed 00:35:48.131 Test: blockdev write read size > 128k ...passed 00:35:48.131 Test: blockdev write read invalid size ...passed 00:35:48.131 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:48.131 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:48.131 Test: blockdev write read max offset ...passed 00:35:48.131 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:48.131 Test: blockdev writev readv 8 blocks ...passed 00:35:48.131 Test: blockdev writev readv 30 x 1block ...passed 00:35:48.131 Test: blockdev writev readv block ...passed 00:35:48.131 Test: blockdev writev readv size > 128k ...passed 00:35:48.131 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:48.131 Test: blockdev comparev and writev ...passed 00:35:48.131 Test: blockdev nvme passthru rw ...passed 00:35:48.131 Test: blockdev nvme passthru vendor specific 
...passed 00:35:48.131 Test: blockdev nvme admin passthru ...passed 00:35:48.131 Test: blockdev copy ...passed 00:35:48.131 Suite: bdevio tests on: crypto_ram 00:35:48.131 Test: blockdev write read block ...passed 00:35:48.131 Test: blockdev write zeroes read block ...passed 00:35:48.131 Test: blockdev write zeroes read no split ...passed 00:35:48.131 Test: blockdev write zeroes read split ...passed 00:35:48.131 Test: blockdev write zeroes read split partial ...passed 00:35:48.131 Test: blockdev reset ...passed 00:35:48.131 Test: blockdev write read 8 blocks ...passed 00:35:48.131 Test: blockdev write read size > 128k ...passed 00:35:48.131 Test: blockdev write read invalid size ...passed 00:35:48.131 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:35:48.131 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:35:48.131 Test: blockdev write read max offset ...passed 00:35:48.131 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:35:48.131 Test: blockdev writev readv 8 blocks ...passed 00:35:48.131 Test: blockdev writev readv 30 x 1block ...passed 00:35:48.131 Test: blockdev writev readv block ...passed 00:35:48.131 Test: blockdev writev readv size > 128k ...passed 00:35:48.131 Test: blockdev writev readv size > 128k in two iovs ...passed 00:35:48.131 Test: blockdev comparev and writev ...passed 00:35:48.131 Test: blockdev nvme passthru rw ...passed 00:35:48.131 Test: blockdev nvme passthru vendor specific ...passed 00:35:48.131 Test: blockdev nvme admin passthru ...passed 00:35:48.131 Test: blockdev copy ...passed 00:35:48.131 00:35:48.131 Run Summary: Type Total Ran Passed Failed Inactive 00:35:48.131 suites 2 2 n/a 0 0 00:35:48.131 tests 46 46 46 0 0 00:35:48.131 asserts 260 260 260 0 n/a 00:35:48.131 00:35:48.131 Elapsed time = 0.199 seconds 00:35:48.131 0 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1658467 00:35:48.131 12:17:01 
blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1658467 ']' 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1658467 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1658467 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1658467' 00:35:48.131 killing process with pid 1658467 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1658467 00:35:48.131 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1658467 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:35:48.390 00:35:48.390 real 0m1.512s 00:35:48.390 user 0m3.852s 00:35:48.390 sys 0m0.404s 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:35:48.390 ************************************ 00:35:48.390 END TEST bdev_bounds 00:35:48.390 ************************************ 00:35:48.390 12:17:01 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:48.390 12:17:01 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:35:48.390 12:17:01 
blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:35:48.390 12:17:01 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:48.390 12:17:01 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:48.390 ************************************ 00:35:48.390 START TEST bdev_nbd 00:35:48.390 ************************************ 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1658679 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1658679 /var/tmp/spdk-nbd.sock 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1658679 ']' 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:35:48.390 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:35:48.390 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:35:48.391 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:35:48.391 12:17:01 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:48.391 [2024-07-15 12:17:01.980094] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:35:48.391 [2024-07-15 12:17:01.980157] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:35:48.650 [2024-07-15 12:17:02.109317] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:35:48.650 [2024-07-15 12:17:02.215226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:35:48.909 [2024-07-15 12:17:02.399213] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:35:48.909 [2024-07-15 12:17:02.399284] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:35:48.909 [2024-07-15 12:17:02.399299] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:48.909 [2024-07-15 12:17:02.407231] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:35:48.909 [2024-07-15 12:17:02.407250] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:35:48.909 [2024-07-15 12:17:02.407261] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:48.909 [2024-07-15 12:17:02.415252] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:35:48.909 [2024-07-15 12:17:02.415271] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:35:48.909 [2024-07-15 12:17:02.415283] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify 
/var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:35:49.475 12:17:02 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:49.732 1+0 records in 00:35:49.732 1+0 records out 00:35:49.732 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000254463 s, 16.1 MB/s 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:35:49.732 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd1 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:49.989 1+0 records in 00:35:49.989 1+0 records out 00:35:49.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000314368 s, 13.0 MB/s 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 
0 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:35:49.989 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:35:50.247 { 00:35:50.247 "nbd_device": "/dev/nbd0", 00:35:50.247 "bdev_name": "crypto_ram" 00:35:50.247 }, 00:35:50.247 { 00:35:50.247 "nbd_device": "/dev/nbd1", 00:35:50.247 "bdev_name": "crypto_ram3" 00:35:50.247 } 00:35:50.247 ]' 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:35:50.247 { 00:35:50.247 "nbd_device": "/dev/nbd0", 00:35:50.247 "bdev_name": "crypto_ram" 00:35:50.247 }, 00:35:50.247 { 00:35:50.247 "nbd_device": "/dev/nbd1", 00:35:50.247 "bdev_name": "crypto_ram3" 00:35:50.247 } 00:35:50.247 ]' 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:50.247 12:17:03 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:50.247 12:17:03 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:50.506 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:50.764 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:35:51.022 12:17:04 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:35:51.022 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:35:51.280 /dev/nbd0 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 
00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:51.280 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:51.538 1+0 records in 00:35:51.538 1+0 records out 00:35:51.538 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278229 s, 14.7 MB/s 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:35:51.538 12:17:04 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:35:51.797 /dev/nbd1 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 
00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:35:51.797 1+0 records in 00:35:51.797 1+0 records out 00:35:51.797 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000364187 s, 11.2 MB/s 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:35:51.797 12:17:05 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:51.797 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:35:52.057 { 00:35:52.057 "nbd_device": "/dev/nbd0", 00:35:52.057 "bdev_name": "crypto_ram" 00:35:52.057 }, 00:35:52.057 { 00:35:52.057 "nbd_device": "/dev/nbd1", 00:35:52.057 "bdev_name": "crypto_ram3" 00:35:52.057 } 00:35:52.057 ]' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:35:52.057 { 00:35:52.057 "nbd_device": "/dev/nbd0", 00:35:52.057 "bdev_name": "crypto_ram" 00:35:52.057 }, 00:35:52.057 { 00:35:52.057 "nbd_device": "/dev/nbd1", 00:35:52.057 "bdev_name": "crypto_ram3" 00:35:52.057 } 00:35:52.057 ]' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:35:52.057 /dev/nbd1' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:35:52.057 /dev/nbd1' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:35:52.057 12:17:05 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:35:52.057 256+0 records in 00:35:52.057 256+0 records out 00:35:52.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0108396 s, 96.7 MB/s 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:35:52.057 256+0 records in 00:35:52.057 256+0 records out 00:35:52.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0303752 s, 34.5 MB/s 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:35:52.057 256+0 records in 00:35:52.057 256+0 records out 00:35:52.057 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0456515 s, 23.0 MB/s 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1' verify 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:52.057 
12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:52.057 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:52.316 12:17:05 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:35:52.575 12:17:06 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:52.575 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:52.834 12:17:06 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:35:52.834 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:35:53.094 malloc_lvol_verify 00:35:53.094 12:17:06 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:35:53.663 a4647694-bb70-4992-b2b2-b2f0b7bf6658 00:35:53.663 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:35:54.260 eb29887b-f592-4081-9bbb-501bda23e0e6 00:35:54.260 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:35:54.520 /dev/nbd0 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:35:54.520 mke2fs 1.46.5 (30-Dec-2021) 00:35:54.520 Discarding device blocks: 0/4096 done 00:35:54.520 Creating filesystem with 4096 1k blocks and 1024 inodes 00:35:54.520 00:35:54.520 Allocating group tables: 0/1 done 00:35:54.520 Writing inode tables: 0/1 done 00:35:54.520 Creating journal (1024 blocks): done 00:35:54.520 Writing superblocks and filesystem accounting information: 0/1 done 00:35:54.520 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks 
/var/tmp/spdk-nbd.sock /dev/nbd0 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:35:54.520 12:17:07 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1658679 00:35:54.779 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1658679 ']' 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@952 -- # kill 
-0 1658679 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1658679 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1658679' 00:35:54.780 killing process with pid 1658679 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1658679 00:35:54.780 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1658679 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:35:55.090 00:35:55.090 real 0m6.584s 00:35:55.090 user 0m9.545s 00:35:55.090 sys 0m2.546s 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:35:55.090 ************************************ 00:35:55.090 END TEST bdev_nbd 00:35:55.090 ************************************ 00:35:55.090 12:17:08 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:35:55.090 12:17:08 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:35:55.090 12:17:08 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:35:55.090 12:17:08 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:35:55.090 12:17:08 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:35:55.090 12:17:08 blockdev_crypto_sw -- 
common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:35:55.090 12:17:08 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:55.090 12:17:08 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:35:55.090 ************************************ 00:35:55.090 START TEST bdev_fio 00:35:55.090 ************************************ 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:35:55.090 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- 
common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 
00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:35:55.090 12:17:08 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:35:55.350 ************************************ 00:35:55.350 START TEST bdev_fio_rw_verify 00:35:55.350 ************************************ 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:35:55.350 12:17:08 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:35:55.350 12:17:08 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:55.609 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:55.609 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:35:55.609 fio-3.35 00:35:55.609 Starting 2 threads 00:36:07.832 00:36:07.832 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1659950: Mon Jul 15 12:17:19 2024 00:36:07.832 read: IOPS=20.9k, BW=81.8MiB/s (85.8MB/s)(818MiB/10001msec) 00:36:07.832 slat (usec): min=14, max=797, avg=21.05, stdev= 5.83 00:36:07.832 clat (usec): min=7, max=1460, avg=153.59, stdev=67.04 00:36:07.832 lat (usec): min=26, max=1490, avg=174.64, stdev=70.01 00:36:07.832 clat 
percentiles (usec): 00:36:07.832 | 50.000th=[ 149], 99.000th=[ 351], 99.900th=[ 449], 99.990th=[ 635], 00:36:07.832 | 99.999th=[ 824] 00:36:07.832 write: IOPS=25.2k, BW=98.3MiB/s (103MB/s)(933MiB/9488msec); 0 zone resets 00:36:07.832 slat (usec): min=14, max=2349, avg=35.04, stdev= 7.91 00:36:07.832 clat (usec): min=25, max=2759, avg=206.00, stdev=99.31 00:36:07.832 lat (usec): min=53, max=2800, avg=241.04, stdev=101.74 00:36:07.832 clat percentiles (usec): 00:36:07.832 | 50.000th=[ 200], 99.000th=[ 469], 99.900th=[ 553], 99.990th=[ 816], 00:36:07.832 | 99.999th=[ 2671] 00:36:07.832 bw ( KiB/s): min=67720, max=104016, per=94.96%, avg=95584.42, stdev=5162.96, samples=38 00:36:07.832 iops : min=16930, max=26004, avg=23896.11, stdev=1290.74, samples=38 00:36:07.832 lat (usec) : 10=0.01%, 20=0.01%, 50=4.22%, 100=13.99%, 250=61.05% 00:36:07.832 lat (usec) : 500=20.41%, 750=0.31%, 1000=0.01% 00:36:07.832 lat (msec) : 2=0.01%, 4=0.01% 00:36:07.832 cpu : usr=99.56%, sys=0.01%, ctx=48, majf=0, minf=561 00:36:07.832 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:07.832 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:07.832 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:07.832 issued rwts: total=209391,238758,0,0 short=0,0,0,0 dropped=0,0,0,0 00:36:07.832 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:07.832 00:36:07.832 Run status group 0 (all jobs): 00:36:07.832 READ: bw=81.8MiB/s (85.8MB/s), 81.8MiB/s-81.8MiB/s (85.8MB/s-85.8MB/s), io=818MiB (858MB), run=10001-10001msec 00:36:07.832 WRITE: bw=98.3MiB/s (103MB/s), 98.3MiB/s-98.3MiB/s (103MB/s-103MB/s), io=933MiB (978MB), run=9488-9488msec 00:36:07.832 00:36:07.832 real 0m11.223s 00:36:07.832 user 0m24.082s 00:36:07.832 sys 0m0.393s 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:07.832 12:17:19 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:36:07.832 ************************************ 00:36:07.832 END TEST bdev_fio_rw_verify 00:36:07.832 ************************************ 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:36:07.832 
12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:36:07.832 12:17:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:07.833 12:17:19 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e670f7db-2151-5b89-af25-b0d901e305f5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": 
"e670f7db-2151-5b89-af25-b0d901e305f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:36:07.833 crypto_ram3 ]] 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "6fd5e9ec-0465-5ec6-8a05-fb30ff13a029",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": 
true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "e670f7db-2151-5b89-af25-b0d901e305f5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "e670f7db-2151-5b89-af25-b0d901e305f5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' 
' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:07.833 ************************************ 00:36:07.833 START TEST bdev_fio_trim 00:36:07.833 ************************************ 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep 
libasan 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:36:07.833 12:17:20 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:36:07.833 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:07.833 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:36:07.833 fio-3.35 00:36:07.833 Starting 2 threads 00:36:17.899 00:36:17.899 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1661458: Mon Jul 15 12:17:31 2024 00:36:17.899 write: IOPS=39.9k, BW=156MiB/s (163MB/s)(1559MiB/10001msec); 0 zone resets 00:36:17.899 slat (usec): min=14, max=1643, avg=22.01, stdev= 4.98 00:36:17.899 clat (usec): min=37, max=1963, avg=164.60, stdev=90.90 00:36:17.899 lat (usec): min=51, max=1990, avg=186.61, stdev=94.24 00:36:17.899 clat percentiles (usec): 00:36:17.899 | 50.000th=[ 133], 99.000th=[ 343], 99.900th=[ 367], 99.990th=[ 537], 00:36:17.899 | 99.999th=[ 1893] 00:36:17.899 bw ( KiB/s): min=153288, max=161088, per=100.00%, avg=159695.16, stdev=834.19, samples=38 00:36:17.899 iops : min=38322, max=40272, avg=39924.00, stdev=208.61, samples=38 00:36:17.899 trim: IOPS=39.9k, BW=156MiB/s (163MB/s)(1559MiB/10001msec); 0 zone resets 00:36:17.899 slat (usec): min=6, max=346, avg= 9.91, stdev= 2.26 00:36:17.899 clat (usec): min=9, max=1765, avg=109.82, stdev=33.16 00:36:17.899 lat (usec): min=17, max=1778, avg=119.73, stdev=33.30 00:36:17.899 clat percentiles (usec): 00:36:17.899 | 50.000th=[ 112], 99.000th=[ 180], 99.900th=[ 194], 99.990th=[ 289], 00:36:17.899 | 99.999th=[ 586] 00:36:17.899 bw ( KiB/s): min=153312, max=161088, per=100.00%, avg=159696.84, stdev=832.10, samples=38 00:36:17.899 iops : min=38328, max=40272, avg=39924.32, stdev=208.10, samples=38 00:36:17.899 lat (usec) : 10=0.01%, 50=4.03%, 100=33.69%, 250=49.34%, 500=12.93% 00:36:17.899 lat (usec) : 750=0.01% 00:36:17.899 lat (msec) : 2=0.01% 00:36:17.899 cpu : usr=99.62%, sys=0.00%, ctx=17, majf=0, minf=254 00:36:17.899 IO depths : 1=7.5%, 2=17.4%, 4=60.1%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:36:17.899 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:17.899 complete : 0=0.0%, 4=86.9%, 8=13.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:36:17.899 issued rwts: total=0,399167,399168,0 short=0,0,0,0 
dropped=0,0,0,0 00:36:17.899 latency : target=0, window=0, percentile=100.00%, depth=8 00:36:17.899 00:36:17.899 Run status group 0 (all jobs): 00:36:17.899 WRITE: bw=156MiB/s (163MB/s), 156MiB/s-156MiB/s (163MB/s-163MB/s), io=1559MiB (1635MB), run=10001-10001msec 00:36:17.899 TRIM: bw=156MiB/s (163MB/s), 156MiB/s-156MiB/s (163MB/s-163MB/s), io=1559MiB (1635MB), run=10001-10001msec 00:36:17.899 00:36:17.899 real 0m11.220s 00:36:17.899 user 0m23.645s 00:36:17.899 sys 0m0.379s 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:36:17.899 ************************************ 00:36:17.899 END TEST bdev_fio_trim 00:36:17.899 ************************************ 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:36:17.899 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:36:17.899 00:36:17.899 real 0m22.820s 00:36:17.899 user 0m47.923s 00:36:17.899 sys 0m0.975s 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:36:17.899 ************************************ 00:36:17.899 END TEST bdev_fio 00:36:17.899 ************************************ 00:36:17.899 12:17:31 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:36:17.899 12:17:31 blockdev_crypto_sw -- 
bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:17.899 12:17:31 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:17.899 12:17:31 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:36:17.899 12:17:31 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:17.899 12:17:31 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:17.899 ************************************ 00:36:17.899 START TEST bdev_verify 00:36:17.899 ************************************ 00:36:17.899 12:17:31 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:36:18.160 [2024-07-15 12:17:31.542589] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:36:18.160 [2024-07-15 12:17:31.542652] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1662718 ] 00:36:18.160 [2024-07-15 12:17:31.673313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:18.419 [2024-07-15 12:17:31.775954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:18.420 [2024-07-15 12:17:31.775959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:18.420 [2024-07-15 12:17:31.950591] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:36:18.420 [2024-07-15 12:17:31.950664] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:18.420 [2024-07-15 12:17:31.950679] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:18.420 [2024-07-15 12:17:31.958614] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:36:18.420 [2024-07-15 12:17:31.958634] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:18.420 [2024-07-15 12:17:31.958646] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:18.420 [2024-07-15 12:17:31.966637] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:36:18.420 [2024-07-15 12:17:31.966655] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:36:18.420 [2024-07-15 12:17:31.966667] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:18.678 Running I/O for 5 seconds... 
00:36:23.949 00:36:23.950 Latency(us) 00:36:23.950 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:23.950 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:23.950 Verification LBA range: start 0x0 length 0x800 00:36:23.950 crypto_ram : 5.02 5993.72 23.41 0.00 0.00 21273.94 1695.39 23365.01 00:36:23.950 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:23.950 Verification LBA range: start 0x800 length 0x800 00:36:23.950 crypto_ram : 5.02 4814.34 18.81 0.00 0.00 26476.73 2279.51 27126.21 00:36:23.950 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:36:23.950 Verification LBA range: start 0x0 length 0x800 00:36:23.950 crypto_ram3 : 5.03 3005.16 11.74 0.00 0.00 42366.57 1809.36 28265.96 00:36:23.950 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:36:23.950 Verification LBA range: start 0x800 length 0x800 00:36:23.950 crypto_ram3 : 5.03 2415.83 9.44 0.00 0.00 52655.69 2194.03 33736.79 00:36:23.950 =================================================================================================================== 00:36:23.950 Total : 16229.06 63.39 0.00 0.00 31405.06 1695.39 33736.79 00:36:23.950 00:36:23.950 real 0m5.799s 00:36:23.950 user 0m10.893s 00:36:23.950 sys 0m0.246s 00:36:23.950 12:17:37 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:23.950 12:17:37 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:36:23.950 ************************************ 00:36:23.950 END TEST bdev_verify 00:36:23.950 ************************************ 00:36:23.950 12:17:37 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:36:23.950 12:17:37 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:23.950 12:17:37 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']' 00:36:23.950 12:17:37 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:23.950 12:17:37 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:23.950 ************************************ 00:36:23.950 START TEST bdev_verify_big_io 00:36:23.950 ************************************ 00:36:23.950 12:17:37 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:36:23.950 [2024-07-15 12:17:37.430855] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:36:23.950 [2024-07-15 12:17:37.430917] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1663458 ] 00:36:24.209 [2024-07-15 12:17:37.556408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:36:24.209 [2024-07-15 12:17:37.660488] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:24.209 [2024-07-15 12:17:37.660493] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:24.468 [2024-07-15 12:17:37.825471] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:36:24.468 [2024-07-15 12:17:37.825542] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:24.468 [2024-07-15 12:17:37.825557] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:24.468 [2024-07-15 12:17:37.833493] 
vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:36:24.468 [2024-07-15 12:17:37.833511] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:24.468 [2024-07-15 12:17:37.833522] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:24.468 [2024-07-15 12:17:37.841518] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:36:24.468 [2024-07-15 12:17:37.841535] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:36:24.468 [2024-07-15 12:17:37.841546] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:24.468 Running I/O for 5 seconds... 00:36:29.740 00:36:29.740 Latency(us) 00:36:29.740 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:29.740 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:36:29.740 Verification LBA range: start 0x0 length 0x80 00:36:29.740 crypto_ram : 5.09 427.55 26.72 0.00 0.00 291777.22 6040.71 413959.57 00:36:29.740 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:36:29.740 Verification LBA range: start 0x80 length 0x80 00:36:29.740 crypto_ram : 5.25 365.95 22.87 0.00 0.00 339976.41 7465.41 446784.56 00:36:29.740 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:36:29.740 Verification LBA range: start 0x0 length 0x80 00:36:29.740 crypto_ram3 : 5.27 243.05 15.19 0.00 0.00 495874.21 7038.00 446784.56 00:36:29.740 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:36:29.740 Verification LBA range: start 0x80 length 0x80 00:36:29.740 crypto_ram3 : 5.27 194.46 12.15 0.00 0.00 611758.69 8206.25 474138.71 00:36:29.740 =================================================================================================================== 00:36:29.740 Total : 
1231.01 76.94 0.00 0.00 398253.41 6040.71 474138.71 00:36:29.998 00:36:29.998 real 0m6.054s 00:36:29.998 user 0m11.404s 00:36:29.998 sys 0m0.234s 00:36:29.998 12:17:43 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:29.998 12:17:43 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:36:29.998 ************************************ 00:36:29.998 END TEST bdev_verify_big_io 00:36:29.998 ************************************ 00:36:29.998 12:17:43 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:36:29.998 12:17:43 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:29.998 12:17:43 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:36:29.998 12:17:43 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:29.998 12:17:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:29.998 ************************************ 00:36:29.998 START TEST bdev_write_zeroes 00:36:29.998 ************************************ 00:36:29.998 12:17:43 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:29.998 [2024-07-15 12:17:43.573919] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:36:29.998 [2024-07-15 12:17:43.573981] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1664314 ] 00:36:30.256 [2024-07-15 12:17:43.703550] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:30.256 [2024-07-15 12:17:43.803657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:30.514 [2024-07-15 12:17:43.981154] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:36:30.514 [2024-07-15 12:17:43.981230] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:30.514 [2024-07-15 12:17:43.981245] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:30.514 [2024-07-15 12:17:43.989172] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:36:30.514 [2024-07-15 12:17:43.989192] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:30.514 [2024-07-15 12:17:43.989204] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:30.514 [2024-07-15 12:17:43.997194] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:36:30.514 [2024-07-15 12:17:43.997212] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:36:30.514 [2024-07-15 12:17:43.997223] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:30.514 Running I/O for 1 seconds... 
00:36:31.892 00:36:31.892 Latency(us) 00:36:31.892 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:31.892 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:36:31.892 crypto_ram : 1.01 26665.05 104.16 0.00 0.00 4787.59 1296.47 6411.13 00:36:31.892 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:36:31.892 crypto_ram3 : 1.01 13304.78 51.97 0.00 0.00 9554.96 5955.23 9915.88 00:36:31.892 =================================================================================================================== 00:36:31.892 Total : 39969.84 156.13 0.00 0.00 6376.71 1296.47 9915.88 00:36:31.892 00:36:31.892 real 0m1.758s 00:36:31.892 user 0m1.495s 00:36:31.892 sys 0m0.245s 00:36:31.892 12:17:45 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:31.892 12:17:45 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:36:31.892 ************************************ 00:36:31.892 END TEST bdev_write_zeroes 00:36:31.892 ************************************ 00:36:31.892 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:36:31.892 12:17:45 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:31.892 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:36:31.892 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:31.892 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:31.892 ************************************ 00:36:31.892 START TEST bdev_json_nonenclosed 00:36:31.892 ************************************ 00:36:31.892 12:17:45 blockdev_crypto_sw.bdev_json_nonenclosed -- 
common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:31.892 [2024-07-15 12:17:45.462720] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:36:31.892 [2024-07-15 12:17:45.462846] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1664511 ] 00:36:32.151 [2024-07-15 12:17:45.658337] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:32.410 [2024-07-15 12:17:45.764632] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:32.410 [2024-07-15 12:17:45.764706] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:36:32.410 [2024-07-15 12:17:45.764729] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:36:32.410 [2024-07-15 12:17:45.764741] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:36:32.410 00:36:32.410 real 0m0.513s 00:36:32.410 user 0m0.287s 00:36:32.410 sys 0m0.222s 00:36:32.410 12:17:45 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234 00:36:32.410 12:17:45 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:32.410 12:17:45 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:36:32.410 ************************************ 00:36:32.410 END TEST bdev_json_nonenclosed 00:36:32.410 ************************************ 00:36:32.410 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:36:32.410 12:17:45 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # true 00:36:32.410 12:17:45 
blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:32.410 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']' 00:36:32.410 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:32.410 12:17:45 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:32.410 ************************************ 00:36:32.410 START TEST bdev_json_nonarray 00:36:32.410 ************************************ 00:36:32.410 12:17:45 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:36:32.670 [2024-07-15 12:17:46.023501] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:36:32.670 [2024-07-15 12:17:46.023563] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1664603 ] 00:36:32.670 [2024-07-15 12:17:46.152794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:32.670 [2024-07-15 12:17:46.252914] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:32.670 [2024-07-15 12:17:46.252988] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:36:32.670 [2024-07-15 12:17:46.253010] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:36:32.670 [2024-07-15 12:17:46.253023] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:36:32.929 00:36:32.929 real 0m0.394s 00:36:32.929 user 0m0.243s 00:36:32.929 sys 0m0.148s 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:36:32.929 ************************************ 00:36:32.929 END TEST bdev_json_nonarray 00:36:32.929 ************************************ 00:36:32.929 12:17:46 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 234 00:36:32.929 12:17:46 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # true 00:36:32.929 12:17:46 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:36:32.929 12:17:46 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:36:32.929 12:17:46 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:36:32.929 12:17:46 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:36:32.929 12:17:46 blockdev_crypto_sw -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']' 00:36:32.929 12:17:46 blockdev_crypto_sw -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:32.929 12:17:46 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:32.929 ************************************ 00:36:32.929 START TEST bdev_crypto_enomem 00:36:32.929 ************************************ 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1123 -- # bdev_crypto_enomem 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local 
base_dev=base0 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=1664723 00:36:32.929 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 1664723 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@829 -- # '[' -z 1664723 ']' 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:32.930 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:32.930 12:17:46 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:32.930 [2024-07-15 12:17:46.511424] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:36:32.930 [2024-07-15 12:17:46.511493] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1664723 ] 00:36:33.210 [2024-07-15 12:17:46.657769] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:33.210 [2024-07-15 12:17:46.773348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@862 -- # return 0 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:34.149 true 00:36:34.149 base0 00:36:34.149 true 00:36:34.149 [2024-07-15 12:17:47.492990] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:36:34.149 crypt0 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@897 -- # local bdev_name=crypt0 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_timeout= 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local i 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # [[ -z '' ]] 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # bdev_timeout=2000 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # rpc_cmd bdev_wait_for_examine 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:34.149 [ 00:36:34.149 { 00:36:34.149 "name": "crypt0", 00:36:34.149 "aliases": [ 00:36:34.149 "fc713f8a-c67b-5e7d-85e9-b6533d2c7a95" 00:36:34.149 ], 00:36:34.149 "product_name": "crypto", 00:36:34.149 "block_size": 512, 00:36:34.149 "num_blocks": 2097152, 00:36:34.149 "uuid": "fc713f8a-c67b-5e7d-85e9-b6533d2c7a95", 00:36:34.149 "assigned_rate_limits": { 00:36:34.149 "rw_ios_per_sec": 0, 00:36:34.149 "rw_mbytes_per_sec": 0, 00:36:34.149 "r_mbytes_per_sec": 0, 00:36:34.149 "w_mbytes_per_sec": 0 00:36:34.149 }, 00:36:34.149 "claimed": false, 00:36:34.149 "zoned": false, 00:36:34.149 "supported_io_types": { 00:36:34.149 "read": true, 00:36:34.149 "write": true, 00:36:34.149 "unmap": false, 00:36:34.149 "flush": false, 00:36:34.149 "reset": true, 00:36:34.149 "nvme_admin": false, 00:36:34.149 "nvme_io": false, 00:36:34.149 "nvme_io_md": false, 00:36:34.149 "write_zeroes": true, 00:36:34.149 "zcopy": false, 00:36:34.149 "get_zone_info": false, 00:36:34.149 "zone_management": false, 00:36:34.149 "zone_append": false, 00:36:34.149 "compare": false, 00:36:34.149 "compare_and_write": false, 00:36:34.149 "abort": false, 
00:36:34.149 "seek_hole": false, 00:36:34.149 "seek_data": false, 00:36:34.149 "copy": false, 00:36:34.149 "nvme_iov_md": false 00:36:34.149 }, 00:36:34.149 "memory_domains": [ 00:36:34.149 { 00:36:34.149 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:36:34.149 "dma_device_type": 2 00:36:34.149 } 00:36:34.149 ], 00:36:34.149 "driver_specific": { 00:36:34.149 "crypto": { 00:36:34.149 "base_bdev_name": "EE_base0", 00:36:34.149 "name": "crypt0", 00:36:34.149 "key_name": "test_dek_sw" 00:36:34.149 } 00:36:34.149 } 00:36:34.149 } 00:36:34.149 ] 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # return 0 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=1664870 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:36:34.149 12:17:47 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:36:34.149 Running I/O for 5 seconds... 
00:36:35.086 12:17:48 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:36:35.086 12:17:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:35.086 12:17:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:35.086 12:17:48 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:35.086 12:17:48 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 1664870 00:36:39.280 00:36:39.280 Latency(us) 00:36:39.280 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:39.280 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:36:39.280 crypt0 : 5.00 28248.31 110.34 0.00 0.00 1127.92 548.51 1823.61 00:36:39.280 =================================================================================================================== 00:36:39.280 Total : 28248.31 110.34 0.00 0.00 1127.92 548.51 1823.61 00:36:39.280 0 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 1664723 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@948 -- # '[' -z 1664723 ']' 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@952 -- # kill -0 1664723 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # uname 00:36:39.280 12:17:52 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1664723 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1664723' 00:36:39.280 killing process with pid 1664723 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # kill 1664723 00:36:39.280 Received shutdown signal, test time was about 5.000000 seconds 00:36:39.280 00:36:39.280 Latency(us) 00:36:39.280 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:36:39.280 =================================================================================================================== 00:36:39.280 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:36:39.280 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@972 -- # wait 1664723 00:36:39.540 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:36:39.540 00:36:39.540 real 0m6.517s 00:36:39.540 user 0m6.758s 00:36:39.540 sys 0m0.412s 00:36:39.540 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:39.540 12:17:52 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:36:39.540 ************************************ 00:36:39.540 END TEST bdev_crypto_enomem 00:36:39.540 ************************************ 00:36:39.540 12:17:53 blockdev_crypto_sw -- common/autotest_common.sh@1142 -- # return 0 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - 
SIGINT SIGTERM EXIT 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:36:39.540 12:17:53 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:36:39.540 00:36:39.540 real 0m55.699s 00:36:39.540 user 1m35.449s 00:36:39.540 sys 0m6.949s 00:36:39.540 12:17:53 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:39.540 12:17:53 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:36:39.540 ************************************ 00:36:39.540 END TEST blockdev_crypto_sw 00:36:39.540 ************************************ 00:36:39.540 12:17:53 -- common/autotest_common.sh@1142 -- # return 0 00:36:39.540 12:17:53 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:36:39.540 12:17:53 -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:36:39.540 12:17:53 -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:39.540 12:17:53 -- common/autotest_common.sh@10 -- # set +x 00:36:39.540 ************************************ 00:36:39.540 START TEST blockdev_crypto_qat 00:36:39.540 ************************************ 00:36:39.540 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:36:39.807 * Looking for test storage... 
00:36:39.807 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # 
env_ctx= 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1665631 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:36:39.807 12:17:53 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1665631 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@829 -- # '[' -z 1665631 ']' 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:39.807 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:39.807 12:17:53 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:39.807 [2024-07-15 12:17:53.297014] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:36:39.807 [2024-07-15 12:17:53.297069] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1665631 ] 00:36:40.069 [2024-07-15 12:17:53.401300] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:40.069 [2024-07-15 12:17:53.500440] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:40.636 12:17:54 blockdev_crypto_qat -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:40.636 12:17:54 blockdev_crypto_qat -- common/autotest_common.sh@862 -- # return 0 00:36:40.636 12:17:54 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:36:40.636 12:17:54 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:36:40.636 12:17:54 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:36:40.636 12:17:54 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:40.636 12:17:54 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:40.636 [2024-07-15 12:17:54.182670] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:40.636 [2024-07-15 12:17:54.190708] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:40.636 [2024-07-15 12:17:54.198736] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:40.894 [2024-07-15 12:17:54.265122] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:43.429 true 00:36:43.429 true 00:36:43.429 true 00:36:43.429 true 00:36:43.429 Malloc0 00:36:43.429 Malloc1 00:36:43.429 Malloc2 00:36:43.429 Malloc3 00:36:43.429 [2024-07-15 12:17:56.666810] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:36:43.429 crypto_ram 00:36:43.429 [2024-07-15 12:17:56.674825] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:43.429 crypto_ram1 00:36:43.429 [2024-07-15 12:17:56.682843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:43.429 crypto_ram2 00:36:43.429 [2024-07-15 12:17:56.690863] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:43.429 crypto_ram3 00:36:43.429 [ 00:36:43.429 { 00:36:43.429 "name": "Malloc1", 00:36:43.429 "aliases": [ 00:36:43.429 "e267f40f-5704-4482-8b0d-d705be79b676" 00:36:43.429 ], 00:36:43.429 "product_name": "Malloc disk", 00:36:43.429 "block_size": 512, 00:36:43.429 "num_blocks": 65536, 00:36:43.429 "uuid": "e267f40f-5704-4482-8b0d-d705be79b676", 00:36:43.429 "assigned_rate_limits": { 00:36:43.429 "rw_ios_per_sec": 0, 00:36:43.429 "rw_mbytes_per_sec": 0, 00:36:43.429 "r_mbytes_per_sec": 0, 00:36:43.429 "w_mbytes_per_sec": 0 00:36:43.429 }, 00:36:43.429 "claimed": true, 00:36:43.429 "claim_type": "exclusive_write", 00:36:43.429 "zoned": false, 00:36:43.429 "supported_io_types": { 00:36:43.429 "read": true, 00:36:43.429 "write": true, 00:36:43.429 "unmap": true, 00:36:43.429 "flush": true, 00:36:43.429 "reset": true, 00:36:43.429 "nvme_admin": false, 00:36:43.429 "nvme_io": false, 00:36:43.429 "nvme_io_md": false, 00:36:43.429 "write_zeroes": true, 00:36:43.429 "zcopy": true, 00:36:43.429 "get_zone_info": false, 00:36:43.429 "zone_management": false, 00:36:43.429 "zone_append": false, 00:36:43.429 "compare": false, 00:36:43.429 "compare_and_write": false, 00:36:43.429 "abort": true, 00:36:43.429 "seek_hole": false, 00:36:43.429 "seek_data": false, 00:36:43.429 "copy": true, 00:36:43.429 "nvme_iov_md": false 00:36:43.429 }, 00:36:43.429 "memory_domains": [ 00:36:43.429 { 00:36:43.429 "dma_device_id": "system", 00:36:43.430 "dma_device_type": 1 00:36:43.430 }, 00:36:43.430 { 00:36:43.430 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:36:43.430 "dma_device_type": 2 00:36:43.430 } 00:36:43.430 ], 00:36:43.430 "driver_specific": {} 00:36:43.430 } 00:36:43.430 ] 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:36:43.430 12:17:56 blockdev_crypto_qat -- 
bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@559 -- # xtrace_disable 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "734f8864-3be3-5670-b43b-682df027f066"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "734f8864-3be3-5670-b43b-682df027f066",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7d552568-b23a-5a73-a1f6-3b7aee979431"' ' ],' ' "product_name": 
"crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7d552568-b23a-5a73-a1f6-3b7aee979431",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ceacbdac-deac-5cef-8b48-e23a54240dd5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ceacbdac-deac-5cef-8b48-e23a54240dd5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' 
' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4f09cb8a-b5a9-57e8-96be-04669d37ef98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4f09cb8a-b5a9-57e8-96be-04669d37ef98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:36:43.430 
12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:36:43.430 12:17:56 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 1665631 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@948 -- # '[' -z 1665631 ']' 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@952 -- # kill -0 1665631 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # uname 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1665631 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1665631' 00:36:43.430 killing process with pid 1665631 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # kill 1665631 00:36:43.430 12:17:56 blockdev_crypto_qat -- common/autotest_common.sh@972 -- # wait 1665631 00:36:43.999 12:17:57 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:36:43.999 12:17:57 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:36:43.999 12:17:57 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 7 -le 1 ']' 00:36:43.999 12:17:57 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:43.999 12:17:57 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:44.258 ************************************ 00:36:44.258 START TEST bdev_hello_world 00:36:44.258 
************************************ 00:36:44.258 12:17:57 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:36:44.258 [2024-07-15 12:17:57.690068] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:36:44.258 [2024-07-15 12:17:57.690135] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1666203 ] 00:36:44.258 [2024-07-15 12:17:57.817475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:44.518 [2024-07-15 12:17:57.924542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:44.518 [2024-07-15 12:17:57.945882] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:44.518 [2024-07-15 12:17:57.953911] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:44.518 [2024-07-15 12:17:57.961929] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:44.518 [2024-07-15 12:17:58.080915] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:47.071 [2024-07-15 12:18:00.308745] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:47.071 [2024-07-15 12:18:00.308822] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:47.071 [2024-07-15 12:18:00.308838] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:47.072 [2024-07-15 12:18:00.316757] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_qat_xts" 00:36:47.072 [2024-07-15 12:18:00.316778] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:47.072 [2024-07-15 12:18:00.316790] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:47.072 [2024-07-15 12:18:00.324778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:47.072 [2024-07-15 12:18:00.324796] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:47.072 [2024-07-15 12:18:00.324808] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:47.072 [2024-07-15 12:18:00.332797] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:47.072 [2024-07-15 12:18:00.332815] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:47.072 [2024-07-15 12:18:00.332826] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:47.072 [2024-07-15 12:18:00.409812] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:36:47.072 [2024-07-15 12:18:00.409859] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:36:47.072 [2024-07-15 12:18:00.409879] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:36:47.072 [2024-07-15 12:18:00.411189] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:36:47.072 [2024-07-15 12:18:00.411268] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:36:47.072 [2024-07-15 12:18:00.411285] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:36:47.072 [2024-07-15 12:18:00.411332] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:36:47.072 00:36:47.072 [2024-07-15 12:18:00.411350] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:36:47.330 00:36:47.330 real 0m3.198s 00:36:47.330 user 0m2.743s 00:36:47.330 sys 0m0.402s 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:36:47.330 ************************************ 00:36:47.330 END TEST bdev_hello_world 00:36:47.330 ************************************ 00:36:47.330 12:18:00 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:47.330 12:18:00 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:36:47.330 12:18:00 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:36:47.330 12:18:00 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:47.330 12:18:00 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:47.330 ************************************ 00:36:47.330 START TEST bdev_bounds 00:36:47.330 ************************************ 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1123 -- # bdev_bounds '' 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=1666632 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 1666632' 00:36:47.330 Process bdevio pid: 1666632 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- 
bdev/blockdev.sh@293 -- # waitforlisten 1666632 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@829 -- # '[' -z 1666632 ']' 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:36:47.330 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:47.330 12:18:00 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:47.588 [2024-07-15 12:18:00.973082] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:36:47.588 [2024-07-15 12:18:00.973153] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1666632 ] 00:36:47.588 [2024-07-15 12:18:01.104216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:36:47.846 [2024-07-15 12:18:01.211342] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:36:47.846 [2024-07-15 12:18:01.211430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:47.846 [2024-07-15 12:18:01.211430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:36:47.846 [2024-07-15 12:18:01.232960] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:47.846 [2024-07-15 12:18:01.240985] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:47.846 [2024-07-15 12:18:01.249005] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:47.846 [2024-07-15 12:18:01.356014] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:50.414 [2024-07-15 12:18:03.564589] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:50.414 [2024-07-15 12:18:03.564679] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:50.414 [2024-07-15 12:18:03.564699] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:50.414 [2024-07-15 12:18:03.572607] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:50.414 [2024-07-15 12:18:03.572626] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:36:50.414 [2024-07-15 12:18:03.572637] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:50.414 [2024-07-15 12:18:03.580631] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:50.414 [2024-07-15 12:18:03.580648] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:50.414 [2024-07-15 12:18:03.580664] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:50.414 [2024-07-15 12:18:03.588652] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:50.414 [2024-07-15 12:18:03.588670] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:50.414 [2024-07-15 12:18:03.588681] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:50.414 12:18:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:50.414 12:18:03 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@862 -- # return 0 00:36:50.414 12:18:03 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:36:50.674 I/O targets: 00:36:50.674 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:36:50.674 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:36:50.674 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:36:50.674 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:36:50.674 00:36:50.674 00:36:50.674 CUnit - A unit testing framework for C - Version 2.1-3 00:36:50.674 http://cunit.sourceforge.net/ 00:36:50.674 00:36:50.674 00:36:50.674 Suite: bdevio tests on: crypto_ram3 00:36:50.674 Test: blockdev write read block ...passed 00:36:50.674 Test: blockdev write zeroes read block ...passed 00:36:50.674 Test: blockdev write zeroes read no split ...passed 00:36:50.674 Test: blockdev write zeroes read split 
...passed 00:36:50.674 Test: blockdev write zeroes read split partial ...passed 00:36:50.674 Test: blockdev reset ...passed 00:36:50.674 Test: blockdev write read 8 blocks ...passed 00:36:50.674 Test: blockdev write read size > 128k ...passed 00:36:50.674 Test: blockdev write read invalid size ...passed 00:36:50.674 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:50.674 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:50.674 Test: blockdev write read max offset ...passed 00:36:50.674 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:50.674 Test: blockdev writev readv 8 blocks ...passed 00:36:50.674 Test: blockdev writev readv 30 x 1block ...passed 00:36:50.674 Test: blockdev writev readv block ...passed 00:36:50.674 Test: blockdev writev readv size > 128k ...passed 00:36:50.674 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:50.674 Test: blockdev comparev and writev ...passed 00:36:50.675 Test: blockdev nvme passthru rw ...passed 00:36:50.675 Test: blockdev nvme passthru vendor specific ...passed 00:36:50.675 Test: blockdev nvme admin passthru ...passed 00:36:50.675 Test: blockdev copy ...passed 00:36:50.675 Suite: bdevio tests on: crypto_ram2 00:36:50.675 Test: blockdev write read block ...passed 00:36:50.675 Test: blockdev write zeroes read block ...passed 00:36:50.675 Test: blockdev write zeroes read no split ...passed 00:36:50.675 Test: blockdev write zeroes read split ...passed 00:36:50.675 Test: blockdev write zeroes read split partial ...passed 00:36:50.675 Test: blockdev reset ...passed 00:36:50.675 Test: blockdev write read 8 blocks ...passed 00:36:50.675 Test: blockdev write read size > 128k ...passed 00:36:50.675 Test: blockdev write read invalid size ...passed 00:36:50.675 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:50.675 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:50.675 Test: 
blockdev write read max offset ...passed 00:36:50.675 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:50.675 Test: blockdev writev readv 8 blocks ...passed 00:36:50.675 Test: blockdev writev readv 30 x 1block ...passed 00:36:50.675 Test: blockdev writev readv block ...passed 00:36:50.675 Test: blockdev writev readv size > 128k ...passed 00:36:50.675 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:50.675 Test: blockdev comparev and writev ...passed 00:36:50.675 Test: blockdev nvme passthru rw ...passed 00:36:50.675 Test: blockdev nvme passthru vendor specific ...passed 00:36:50.675 Test: blockdev nvme admin passthru ...passed 00:36:50.675 Test: blockdev copy ...passed 00:36:50.675 Suite: bdevio tests on: crypto_ram1 00:36:50.675 Test: blockdev write read block ...passed 00:36:50.675 Test: blockdev write zeroes read block ...passed 00:36:50.675 Test: blockdev write zeroes read no split ...passed 00:36:50.934 Test: blockdev write zeroes read split ...passed 00:36:50.934 Test: blockdev write zeroes read split partial ...passed 00:36:50.935 Test: blockdev reset ...passed 00:36:50.935 Test: blockdev write read 8 blocks ...passed 00:36:50.935 Test: blockdev write read size > 128k ...passed 00:36:50.935 Test: blockdev write read invalid size ...passed 00:36:50.935 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:50.935 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:50.935 Test: blockdev write read max offset ...passed 00:36:50.935 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:50.935 Test: blockdev writev readv 8 blocks ...passed 00:36:50.935 Test: blockdev writev readv 30 x 1block ...passed 00:36:50.935 Test: blockdev writev readv block ...passed 00:36:50.935 Test: blockdev writev readv size > 128k ...passed 00:36:50.935 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:50.935 Test: blockdev comparev and writev 
...passed 00:36:50.935 Test: blockdev nvme passthru rw ...passed 00:36:50.935 Test: blockdev nvme passthru vendor specific ...passed 00:36:50.935 Test: blockdev nvme admin passthru ...passed 00:36:50.935 Test: blockdev copy ...passed 00:36:50.935 Suite: bdevio tests on: crypto_ram 00:36:50.935 Test: blockdev write read block ...passed 00:36:50.935 Test: blockdev write zeroes read block ...passed 00:36:51.193 Test: blockdev write zeroes read no split ...passed 00:36:51.193 Test: blockdev write zeroes read split ...passed 00:36:51.452 Test: blockdev write zeroes read split partial ...passed 00:36:51.452 Test: blockdev reset ...passed 00:36:51.452 Test: blockdev write read 8 blocks ...passed 00:36:51.452 Test: blockdev write read size > 128k ...passed 00:36:51.452 Test: blockdev write read invalid size ...passed 00:36:51.452 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:36:51.452 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:36:51.452 Test: blockdev write read max offset ...passed 00:36:51.452 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:36:51.452 Test: blockdev writev readv 8 blocks ...passed 00:36:51.452 Test: blockdev writev readv 30 x 1block ...passed 00:36:51.452 Test: blockdev writev readv block ...passed 00:36:51.452 Test: blockdev writev readv size > 128k ...passed 00:36:51.452 Test: blockdev writev readv size > 128k in two iovs ...passed 00:36:51.452 Test: blockdev comparev and writev ...passed 00:36:51.452 Test: blockdev nvme passthru rw ...passed 00:36:51.452 Test: blockdev nvme passthru vendor specific ...passed 00:36:51.452 Test: blockdev nvme admin passthru ...passed 00:36:51.452 Test: blockdev copy ...passed 00:36:51.452 00:36:51.452 Run Summary: Type Total Ran Passed Failed Inactive 00:36:51.452 suites 4 4 n/a 0 0 00:36:51.452 tests 92 92 92 0 0 00:36:51.452 asserts 520 520 520 0 n/a 00:36:51.452 00:36:51.452 Elapsed time = 1.574 seconds 00:36:51.452 0 00:36:51.452 
12:18:04 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 1666632 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@948 -- # '[' -z 1666632 ']' 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@952 -- # kill -0 1666632 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # uname 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1666632 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1666632' 00:36:51.452 killing process with pid 1666632 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # kill 1666632 00:36:51.452 12:18:04 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@972 -- # wait 1666632 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:36:52.016 00:36:52.016 real 0m4.402s 00:36:52.016 user 0m12.109s 00:36:52.016 sys 0m0.608s 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # xtrace_disable 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:36:52.016 ************************************ 00:36:52.016 END TEST bdev_bounds 00:36:52.016 ************************************ 00:36:52.016 12:18:05 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:36:52.016 12:18:05 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:36:52.016 12:18:05 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 5 -le 1 ']' 00:36:52.016 12:18:05 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:36:52.016 12:18:05 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:36:52.016 ************************************ 00:36:52.016 START TEST bdev_nbd 00:36:52.016 ************************************ 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1123 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:36:52.016 12:18:05 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=1667514 00:36:52.016 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 1667514 /var/tmp/spdk-nbd.sock 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@829 -- # '[' -z 1667514 ']' 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local max_retries=100 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:36:52.017 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # xtrace_disable 00:36:52.017 12:18:05 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:36:52.017 [2024-07-15 12:18:05.517603] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:36:52.017 [2024-07-15 12:18:05.517762] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:36:52.274 [2024-07-15 12:18:05.712470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:36:52.275 [2024-07-15 12:18:05.814788] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:36:52.275 [2024-07-15 12:18:05.836068] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:36:52.275 [2024-07-15 12:18:05.844086] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:36:52.275 [2024-07-15 12:18:05.852103] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:36:52.532 [2024-07-15 12:18:05.964229] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:36:55.063 [2024-07-15 12:18:08.170976] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:36:55.063 [2024-07-15 12:18:08.171042] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:36:55.063 [2024-07-15 12:18:08.171057] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:55.063 [2024-07-15 12:18:08.178998] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:36:55.063 [2024-07-15 12:18:08.179017] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: Malloc1 00:36:55.063 [2024-07-15 12:18:08.179028] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:55.063 [2024-07-15 12:18:08.187018] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:36:55.063 [2024-07-15 12:18:08.187035] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:36:55.063 [2024-07-15 12:18:08.187046] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:55.063 [2024-07-15 12:18:08.195038] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:36:55.063 [2024-07-15 12:18:08.195055] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:36:55.063 [2024-07-15 12:18:08.195066] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@862 -- # return 0 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # 
local rpc_server=/var/tmp/spdk-nbd.sock 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:55.063 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:55.321 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:55.322 
12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:55.322 1+0 records in 00:36:55.322 1+0 records out 00:36:55.322 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300564 s, 13.6 MB/s 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:55.322 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 /proc/partitions 00:36:55.580 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:55.581 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:55.581 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:55.581 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:55.581 1+0 records in 00:36:55.581 1+0 records out 00:36:55.581 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000306975 s, 13.3 MB/s 00:36:55.581 12:18:08 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:55.581 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- 
# basename /dev/nbd2 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd2 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd2 /proc/partitions 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:55.838 1+0 records in 00:36:55.838 1+0 records out 00:36:55.838 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034777 s, 11.8 MB/s 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( 
i++ )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:55.838 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd3 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd3 /proc/partitions 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:56.096 1+0 records in 00:36:56.096 1+0 records out 00:36:56.096 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00040916 s, 10.0 MB/s 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@884 -- # size=4096 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:36:56.096 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd0", 00:36:56.354 "bdev_name": "crypto_ram" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd1", 00:36:56.354 "bdev_name": "crypto_ram1" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd2", 00:36:56.354 "bdev_name": "crypto_ram2" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd3", 00:36:56.354 "bdev_name": "crypto_ram3" 00:36:56.354 } 00:36:56.354 ]' 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd0", 00:36:56.354 "bdev_name": "crypto_ram" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd1", 00:36:56.354 "bdev_name": "crypto_ram1" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd2", 00:36:56.354 "bdev_name": "crypto_ram2" 00:36:56.354 }, 00:36:56.354 { 00:36:56.354 "nbd_device": "/dev/nbd3", 00:36:56.354 "bdev_name": 
"crypto_ram3" 00:36:56.354 } 00:36:56.354 ]' 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:56.354 12:18:09 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:56.613 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:56.613 12:18:10 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:56.873 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd 
-- bdev/nbd_common.sh@45 -- # return 0 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:36:57.440 12:18:10 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:58.007 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@65 -- # echo '' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:36:58.266 12:18:11 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:58.266 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:36:58.524 /dev/nbd0 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd0 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd0 /proc/partitions 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:58.524 1+0 records in 00:36:58.524 1+0 records out 00:36:58.524 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000304276 s, 13.5 MB/s 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:58.524 12:18:11 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:36:58.783 /dev/nbd1 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd1 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd1 
/proc/partitions 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:58.783 1+0 records in 00:36:58.783 1+0 records out 00:36:58.783 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000300884 s, 13.6 MB/s 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:58.783 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:36:59.042 /dev/nbd10 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local 
nbd_name=nbd10 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd10 /proc/partitions 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:59.042 1+0 records in 00:36:59.042 1+0 records out 00:36:59.042 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00035454 s, 11.6 MB/s 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:59.042 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:36:59.301 /dev/nbd11 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@866 -- # local nbd_name=nbd11 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local i 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i = 1 )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # (( i <= 20 )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # grep -q -w nbd11 /proc/partitions 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # break 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i = 1 )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@882 -- # (( i <= 20 )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:36:59.301 1+0 records in 00:36:59.301 1+0 records out 00:36:59.301 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352586 s, 11.6 MB/s 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # size=4096 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@886 -- # '[' 4096 '!=' 0 ']' 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # return 0 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:36:59.301 12:18:12 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:36:59.590 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd0", 00:36:59.590 "bdev_name": "crypto_ram" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd1", 00:36:59.590 "bdev_name": "crypto_ram1" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd10", 00:36:59.590 "bdev_name": "crypto_ram2" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd11", 00:36:59.590 "bdev_name": "crypto_ram3" 00:36:59.590 } 00:36:59.590 ]' 00:36:59.590 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd0", 00:36:59.590 "bdev_name": "crypto_ram" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd1", 00:36:59.590 "bdev_name": "crypto_ram1" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd10", 00:36:59.590 "bdev_name": "crypto_ram2" 00:36:59.590 }, 00:36:59.590 { 00:36:59.590 "nbd_device": "/dev/nbd11", 00:36:59.590 "bdev_name": "crypto_ram3" 00:36:59.590 } 00:36:59.590 ]' 00:36:59.590 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:36:59.590 12:18:13 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:36:59.590 /dev/nbd1 00:36:59.590 /dev/nbd10 00:36:59.590 /dev/nbd11' 00:36:59.881 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:36:59.881 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:36:59.881 /dev/nbd1 00:36:59.881 /dev/nbd10 00:36:59.881 /dev/nbd11' 00:36:59.881 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:36:59.881 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:36:59.881 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:36:59.882 256+0 records in 00:36:59.882 256+0 records out 00:36:59.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0107705 s, 97.4 MB/s 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:59.882 
12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:36:59.882 256+0 records in 00:36:59.882 256+0 records out 00:36:59.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0834161 s, 12.6 MB/s 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:36:59.882 256+0 records in 00:36:59.882 256+0 records out 00:36:59.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.065385 s, 16.0 MB/s 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:36:59.882 256+0 records in 00:36:59.882 256+0 records out 00:36:59.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0584988 s, 17.9 MB/s 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:36:59.882 256+0 records in 00:36:59.882 256+0 records out 00:36:59.882 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0558877 s, 18.8 MB/s 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:36:59.882 12:18:13 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:36:59.882 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 
00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:00.141 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:00.399 12:18:13 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:00.658 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:00.916 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd11 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:37:01.176 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 
00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:37:01.434 12:18:14 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:37:01.693 malloc_lvol_verify 00:37:01.693 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:37:01.693 8a08d541-725d-45df-9243-7f2d8ad8e95a 00:37:01.952 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:37:01.952 3220e697-d366-4bfa-ae4c-5d9da23f6f35 00:37:02.209 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:37:02.209 /dev/nbd0 
00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:37:02.468 mke2fs 1.46.5 (30-Dec-2021) 00:37:02.468 Discarding device blocks: 0/4096 done 00:37:02.468 Creating filesystem with 4096 1k blocks and 1024 inodes 00:37:02.468 00:37:02.468 Allocating group tables: 0/1 done 00:37:02.468 Writing inode tables: 0/1 done 00:37:02.468 Creating journal (1024 blocks): done 00:37:02.468 Writing superblocks and filesystem accounting information: 0/1 done 00:37:02.468 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:37:02.468 12:18:15 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 1667514 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@948 -- # '[' -z 1667514 ']' 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@952 -- # kill -0 1667514 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # uname 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1667514 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1667514' 00:37:02.726 killing process with pid 1667514 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # kill 1667514 00:37:02.726 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@972 -- # wait 1667514 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:37:03.293 00:37:03.293 real 0m11.185s 00:37:03.293 user 0m14.740s 00:37:03.293 sys 0m4.529s 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:03.293 
12:18:16 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:37:03.293 ************************************ 00:37:03.293 END TEST bdev_nbd 00:37:03.293 ************************************ 00:37:03.293 12:18:16 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0 00:37:03.293 12:18:16 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:37:03.293 12:18:16 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = nvme ']' 00:37:03.293 12:18:16 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:37:03.293 12:18:16 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:37:03.293 12:18:16 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 3 -le 1 ']' 00:37:03.293 12:18:16 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:03.293 12:18:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:37:03.293 ************************************ 00:37:03.293 START TEST bdev_fio 00:37:03.293 ************************************ 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1123 -- # fio_test_suite '' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:37:03.293 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 
00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 
00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:37:03.293 ************************************ 00:37:03.293 START TEST bdev_fio_rw_verify 00:37:03.293 ************************************ 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:37:03.293 12:18:16 
blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:37:03.293 12:18:16 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:03.861 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:03.861 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:03.861 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:03.861 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:03.861 fio-3.35 00:37:03.861 Starting 4 threads 00:37:18.743 00:37:18.743 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1669719: Mon Jul 15 12:18:29 2024 00:37:18.743 read: IOPS=26.3k, BW=103MiB/s (108MB/s)(1026MiB/10001msec) 00:37:18.743 slat (usec): min=11, max=1450, avg=53.07, stdev=43.17 00:37:18.743 clat (usec): min=17, max=3465, avg=295.06, stdev=239.37 00:37:18.744 lat (usec): min=41, max=3543, avg=348.13, stdev=268.51 00:37:18.744 clat percentiles (usec): 00:37:18.744 | 50.000th=[ 221], 99.000th=[ 1287], 99.900th=[ 1500], 99.990th=[ 1778], 00:37:18.744 | 99.999th=[ 2606] 00:37:18.744 write: IOPS=28.8k, BW=112MiB/s (118MB/s)(1094MiB/9727msec); 0 zone resets 00:37:18.744 slat (usec): min=13, max=523, avg=62.92, stdev=42.42 00:37:18.744 clat (usec): min=19, 
max=1876, avg=327.78, stdev=243.15 00:37:18.744 lat (usec): min=39, max=2076, avg=390.70, stdev=271.27 00:37:18.744 clat percentiles (usec): 00:37:18.744 | 50.000th=[ 265], 99.000th=[ 1352], 99.900th=[ 1565], 99.990th=[ 1647], 00:37:18.744 | 99.999th=[ 1729] 00:37:18.744 bw ( KiB/s): min=99296, max=142365, per=97.66%, avg=112463.42, stdev=2494.89, samples=76 00:37:18.744 iops : min=24824, max=35591, avg=28115.84, stdev=623.71, samples=76 00:37:18.744 lat (usec) : 20=0.01%, 50=0.05%, 100=8.07%, 250=44.28%, 500=33.79% 00:37:18.744 lat (usec) : 750=7.78%, 1000=2.98% 00:37:18.744 lat (msec) : 2=3.04%, 4=0.01% 00:37:18.744 cpu : usr=99.62%, sys=0.00%, ctx=95, majf=0, minf=270 00:37:18.744 IO depths : 1=1.5%, 2=28.2%, 4=56.3%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:37:18.744 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:18.744 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:18.744 issued rwts: total=262632,280042,0,0 short=0,0,0,0 dropped=0,0,0,0 00:37:18.744 latency : target=0, window=0, percentile=100.00%, depth=8 00:37:18.744 00:37:18.744 Run status group 0 (all jobs): 00:37:18.744 READ: bw=103MiB/s (108MB/s), 103MiB/s-103MiB/s (108MB/s-108MB/s), io=1026MiB (1076MB), run=10001-10001msec 00:37:18.744 WRITE: bw=112MiB/s (118MB/s), 112MiB/s-112MiB/s (118MB/s-118MB/s), io=1094MiB (1147MB), run=9727-9727msec 00:37:18.744 00:37:18.744 real 0m13.582s 00:37:18.744 user 0m46.108s 00:37:18.744 sys 0m0.498s 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # xtrace_disable 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:37:18.744 ************************************ 00:37:18.744 END TEST bdev_fio_rw_verify 00:37:18.744 ************************************ 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0 00:37:18.744 12:18:30 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:37:18.744 12:18:30 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "734f8864-3be3-5670-b43b-682df027f066"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "734f8864-3be3-5670-b43b-682df027f066",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7d552568-b23a-5a73-a1f6-3b7aee979431"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7d552568-b23a-5a73-a1f6-3b7aee979431",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ceacbdac-deac-5cef-8b48-e23a54240dd5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ceacbdac-deac-5cef-8b48-e23a54240dd5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": 
"test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4f09cb8a-b5a9-57e8-96be-04669d37ef98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4f09cb8a-b5a9-57e8-96be-04669d37ef98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:37:18.744 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:37:18.744 crypto_ram1 00:37:18.744 crypto_ram2 00:37:18.744 crypto_ram3 ]] 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "734f8864-3be3-5670-b43b-682df027f066"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "734f8864-3be3-5670-b43b-682df027f066",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' 
"read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "7d552568-b23a-5a73-a1f6-3b7aee979431"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "7d552568-b23a-5a73-a1f6-3b7aee979431",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' 
"crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "ceacbdac-deac-5cef-8b48-e23a54240dd5"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "ceacbdac-deac-5cef-8b48-e23a54240dd5",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "4f09cb8a-b5a9-57e8-96be-04669d37ef98"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "4f09cb8a-b5a9-57e8-96be-04669d37ef98",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' 
"zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:37:18.745 12:18:30 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1099 -- # '[' 11 -le 1 ']' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1105 -- # xtrace_disable 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:37:18.745 ************************************ 00:37:18.745 START TEST bdev_fio_trim 00:37:18.745 ************************************ 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1123 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
--verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim 
-- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:37:18.745 12:18:30 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:37:18.745 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.745 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.745 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.745 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:37:18.745 fio-3.35 00:37:18.745 Starting 4 threads 00:37:30.953 00:37:30.953 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1671543: Mon Jul 15 12:18:43 2024 00:37:30.953 write: IOPS=30.7k, BW=120MiB/s (126MB/s)(1201MiB/10001msec); 0 zone 
resets 00:37:30.953 slat (usec): min=17, max=439, avg=77.08, stdev=30.80 00:37:30.953 clat (usec): min=23, max=1842, avg=272.30, stdev=148.27 00:37:30.953 lat (usec): min=41, max=1944, avg=349.38, stdev=161.94 00:37:30.953 clat percentiles (usec): 00:37:30.953 | 50.000th=[ 249], 99.000th=[ 676], 99.900th=[ 840], 99.990th=[ 930], 00:37:30.953 | 99.999th=[ 1516] 00:37:30.953 bw ( KiB/s): min=84672, max=189728, per=100.00%, avg=123031.58, stdev=7638.59, samples=76 00:37:30.953 iops : min=21168, max=47432, avg=30757.89, stdev=1909.65, samples=76 00:37:30.953 trim: IOPS=30.7k, BW=120MiB/s (126MB/s)(1201MiB/10001msec); 0 zone resets 00:37:30.953 slat (usec): min=5, max=425, avg=20.99, stdev=10.17 00:37:30.953 clat (usec): min=41, max=1944, avg=349.56, stdev=161.97 00:37:30.953 lat (usec): min=47, max=1964, avg=370.55, stdev=166.23 00:37:30.953 clat percentiles (usec): 00:37:30.953 | 50.000th=[ 326], 99.000th=[ 799], 99.900th=[ 1020], 99.990th=[ 1123], 00:37:30.953 | 99.999th=[ 1582] 00:37:30.953 bw ( KiB/s): min=84672, max=189728, per=100.00%, avg=123031.58, stdev=7638.59, samples=76 00:37:30.953 iops : min=21168, max=47432, avg=30757.89, stdev=1909.65, samples=76 00:37:30.953 lat (usec) : 50=0.09%, 100=5.43%, 250=35.69%, 500=45.78%, 750=11.97% 00:37:30.953 lat (usec) : 1000=0.98% 00:37:30.953 lat (msec) : 2=0.07% 00:37:30.953 cpu : usr=99.59%, sys=0.00%, ctx=72, majf=0, minf=113 00:37:30.953 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:37:30.953 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:30.953 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:37:30.953 issued rwts: total=0,307423,307424,0 short=0,0,0,0 dropped=0,0,0,0 00:37:30.953 latency : target=0, window=0, percentile=100.00%, depth=8 00:37:30.953 00:37:30.953 Run status group 0 (all jobs): 00:37:30.953 WRITE: bw=120MiB/s (126MB/s), 120MiB/s-120MiB/s (126MB/s-126MB/s), io=1201MiB (1259MB), run=10001-10001msec 00:37:30.953 
TRIM: bw=120MiB/s (126MB/s), 120MiB/s-120MiB/s (126MB/s-126MB/s), io=1201MiB (1259MB), run=10001-10001msec
00:37:30.953
00:37:30.953 real 0m13.590s
00:37:30.953 user 0m45.951s
00:37:30.953 sys 0m0.521s
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x
00:37:30.953 ************************************
00:37:30.953 END TEST bdev_fio_trim
00:37:30.953 ************************************
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1142 -- # return 0
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd
00:37:30.953 /var/jenkins/workspace/crypto-phy-autotest/spdk
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT
00:37:30.953
00:37:30.953 real 0m27.549s
00:37:30.953 user 1m32.254s
00:37:30.953 sys 0m1.222s
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x
00:37:30.953 ************************************
00:37:30.953 END TEST bdev_fio
00:37:30.953 ************************************
00:37:30.953 12:18:44 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:30.953 12:18:44 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT
00:37:30.953 12:18:44 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:37:30.953 12:18:44 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:37:30.953 12:18:44 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:30.953 12:18:44 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:30.953 ************************************
00:37:30.953 START TEST bdev_verify
00:37:30.953 ************************************
00:37:30.953 12:18:44 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 ''
00:37:30.953 [2024-07-15 12:18:44.367697] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
[2024-07-15 12:18:44.367760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1672956 ]
00:37:30.953 [2024-07-15 12:18:44.494926] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:37:31.212 [2024-07-15 12:18:44.592859] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:37:31.212 [2024-07-15 12:18:44.592865] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:37:31.212 [2024-07-15 12:18:44.614246] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:37:31.212 [2024-07-15 12:18:44.622268] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:37:31.212 [2024-07-15 12:18:44.630291] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:37:31.212 [2024-07-15 12:18:44.727341] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:37:33.747 [2024-07-15 12:18:46.923795] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:37:33.747 [2024-07-15 12:18:46.923872] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:37:33.747 [2024-07-15 12:18:46.923887] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:33.747 [2024-07-15 12:18:46.931812] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:37:33.747 [2024-07-15 12:18:46.931831] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:37:33.747 [2024-07-15 12:18:46.931842] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:33.747 [2024-07-15 12:18:46.939836] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:37:33.747 [2024-07-15 12:18:46.939854] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:37:33.747 [2024-07-15 12:18:46.939865] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:33.747 [2024-07-15 12:18:46.947860] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:37:33.747 [2024-07-15 12:18:46.947876] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:37:33.747 [2024-07-15 12:18:46.947892] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:33.747 Running I/O for 5 seconds...
00:37:39.043
00:37:39.043 Latency(us)
00:37:39.043 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:39.043 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x0 length 0x1000
00:37:39.043 crypto_ram : 5.06 480.35 1.88 0.00 0.00 265928.46 7009.50 160477.72
00:37:39.043 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x1000 length 0x1000
00:37:39.043 crypto_ram : 5.07 403.65 1.58 0.00 0.00 313569.35 5157.40 175978.41
00:37:39.043 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x0 length 0x1000
00:37:39.043 crypto_ram1 : 5.06 480.24 1.88 0.00 0.00 265186.55 7522.39 148624.25
00:37:39.043 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x1000 length 0x1000
00:37:39.043 crypto_ram1 : 5.07 397.58 1.55 0.00 0.00 320707.07 5470.83 200597.15
00:37:39.043 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x0 length 0x1000
00:37:39.043 crypto_ram2 : 5.04 3731.05 14.57 0.00 0.00 34003.23 7579.38 27582.11
00:37:39.043 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x1000 length 0x1000
00:37:39.043 crypto_ram2 : 5.05 3044.59 11.89 0.00 0.00 41772.84 7123.48 44678.46
00:37:39.043 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x0 length 0x1000
00:37:39.043 crypto_ram3 : 5.06 3745.69 14.63 0.00 0.00 33804.56 3932.16 27468.13
00:37:39.043 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096)
00:37:39.043 Verification LBA range: start 0x1000 length 0x1000
00:37:39.043 crypto_ram3 : 5.05 3043.38 11.89 0.00 0.00 41667.78 6553.60 35788.35
00:37:39.043 ===================================================================================================================
00:37:39.043 Total : 15326.54 59.87 0.00 0.00 66430.52 3932.16 200597.15
00:37:39.043
00:37:39.043 real 0m8.209s
00:37:39.043 user 0m15.574s
00:37:39.043 sys 0m0.362s
00:37:39.043 12:18:52 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:39.043 12:18:52 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x
00:37:39.043 ************************************
00:37:39.043 END TEST bdev_verify
00:37:39.043 ************************************
00:37:39.043 12:18:52 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:39.043 12:18:52 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:37:39.043 12:18:52 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 16 -le 1 ']'
00:37:39.043 12:18:52 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:39.043 12:18:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:39.043 ************************************
00:37:39.043 START TEST bdev_verify_big_io
00:37:39.043 ************************************
00:37:39.043 12:18:52 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 ''
00:37:39.302 [2024-07-15 12:18:52.654115] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:37:39.302 [2024-07-15 12:18:52.654176] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1674014 ]
00:37:39.302 [2024-07-15 12:18:52.783869] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:37:39.302 [2024-07-15 12:18:52.887204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:37:39.302 [2024-07-15 12:18:52.887208] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:37:39.560 [2024-07-15 12:18:52.908597] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:37:39.560 [2024-07-15 12:18:52.916623] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:37:39.560 [2024-07-15 12:18:52.924642] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:37:39.560 [2024-07-15 12:18:53.032200] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:37:42.131 [2024-07-15 12:18:55.242761] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:37:42.131 [2024-07-15 12:18:55.242853] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:37:42.131 [2024-07-15 12:18:55.242868] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:42.131 [2024-07-15 12:18:55.250778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:37:42.131 [2024-07-15 12:18:55.250800] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:37:42.131 [2024-07-15 12:18:55.250813] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:42.131
[2024-07-15 12:18:55.258799] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:37:42.131 [2024-07-15 12:18:55.258818] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:37:42.131 [2024-07-15 12:18:55.258830] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:42.131 [2024-07-15 12:18:55.266819] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:37:42.131 [2024-07-15 12:18:55.266837] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:37:42.131 [2024-07-15 12:18:55.266848] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:42.131 Running I/O for 5 seconds...
00:37:42.698 [2024-07-15 12:18:56.254778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.255322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.255413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.255473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.255525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.255577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.698 [2024-07-15 12:18:56.256115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:42.963 [2024-07-15 12:18:56.350789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.350842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.350894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.351476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.351533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.351585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.351637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.352185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.352208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.963 [2024-07-15 12:18:56.355789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.355957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.356301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.356321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.359716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.359772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.359823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.359876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.360259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.360321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.360374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.963 [2024-07-15 12:18:56.360428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.360775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.360795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.363282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.363337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.363389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.363441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.363996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.364050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.364101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.364153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.364711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.364733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.963 [2024-07-15 12:18:56.367811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.367868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.367918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.367976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.368481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.368535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.368585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.368635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.369050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.369071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.372204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.372261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.372313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.963 [2024-07-15 12:18:56.372366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.372901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.372956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.373007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.373057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.373430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.373450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.375928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.963 [2024-07-15 12:18:56.375988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.376040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.376091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.376579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.376658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.376719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.376770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.377319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.377351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.380378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.380434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.380514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.380565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.380970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.381027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.381079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.381135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.381484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.381505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.384486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.384542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.384593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.384646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.385769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.388321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.388926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.389449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.389475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.392634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.392722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.392774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.392824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.393210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.393264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.393315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.393366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.393843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.393863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.396497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.396553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.396607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.396658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.397216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.397275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.397327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.397379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.397925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.397947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.400992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.401332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.401352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.406547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.408573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.410587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.411989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.414431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.416459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.416965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.417457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.418015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.418037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.422172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.424041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.426099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.427654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.428817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.429315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.429812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.431585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.431938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.431959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.436097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.436593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.437090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.437586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.439760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.441783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.443768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.964 [2024-07-15 12:18:56.445293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.445671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.445697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.449080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.964 [2024-07-15 12:18:56.450933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.452884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.454913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.456990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.458834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.460901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.462464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.463040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:42.965 [2024-07-15 12:18:56.463062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:42.965 [2024-07-15 12:18:56.467886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.228 [last message repeated ~270 times between 12:18:56.467886 and 12:18:56.804481] 
00:37:43.228 [2024-07-15 12:18:56.804989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.805047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.805591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.806103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.807316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.809084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.811109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.811457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.811477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.813571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.813628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.813680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.813753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.228 [2024-07-15 12:18:56.814355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.814419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.814471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.814523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.814574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.815135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.815156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.817374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.817429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.817480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.817550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.228 [2024-07-15 12:18:56.818156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.228 [2024-07-15 12:18:56.818644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.821251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.821307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.821359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.821411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.821960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.822026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.822085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.822138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.490 [2024-07-15 12:18:56.822189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.822566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.822586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.824726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.824781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.824833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.824883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.825866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.490 [2024-07-15 12:18:56.825888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.828789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.828856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.828912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.828966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.829905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.832023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.490 [2024-07-15 12:18:56.832080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.832131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.832183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.490 [2024-07-15 12:18:56.832742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.832806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.832859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.832911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.832963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.833475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.833496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.835718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.835773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.835823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.491 [2024-07-15 12:18:56.835873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.836955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.839432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.839487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.839544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.839595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.491 [2024-07-15 12:18:56.840195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.840727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.842829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.842884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.842934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.842985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.491 [2024-07-15 12:18:56.843512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.843944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.846894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.846949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.847640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.491 [2024-07-15 12:18:56.847988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.848009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.850982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.851034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.851086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.851593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.851614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.491 [2024-07-15 12:18:56.853926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.853980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.854794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.855168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.491 [2024-07-15 12:18:56.855188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.857646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.857715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.492 [2024-07-15 12:18:56.857768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.857820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.858990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.492 [2024-07-15 12:18:56.861576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.861812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.862149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.862169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.492 [2024-07-15 12:18:56.865924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.492 [2024-07-15 12:18:56.865984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [... same message repeated many times, timestamps 2024-07-15 12:18:56.866040 through 12:18:57.007040 ...]
00:37:43.496 [2024-07-15 12:18:57.007513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.008029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.008525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.009021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.009515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.010036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.010057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.013379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.013890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.014402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.014917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.015415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.015931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.496 [2024-07-15 12:18:57.016426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.016922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.017414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.017930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.017951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.021341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.021857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.022353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.022864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.023363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.023880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.024376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.024871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.496 [2024-07-15 12:18:57.025358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.025885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.025907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.029320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.029845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.030342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.030843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.031383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.031904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.032399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.032895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.033386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.033840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.496 [2024-07-15 12:18:57.033862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.037251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.037779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.038273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.038771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.039244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.039761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.040257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.040754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.041251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.496 [2024-07-15 12:18:57.041733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.041754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.045197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.497 [2024-07-15 12:18:57.045712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.046210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.046709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.047191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.048963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.050371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.051116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.051607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.052164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.052186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.055532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.056478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.058250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.497 [2024-07-15 12:18:57.060269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.060613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.061841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.063619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.065652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.067679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.068170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.068191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.072829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.074854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.076051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.077841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.078186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.497 [2024-07-15 12:18:57.080223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.081006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.497 [2024-07-15 12:18:57.081499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.082001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.082534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.082556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.086322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.088335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.090356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.090861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.091383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.758 [2024-07-15 12:18:57.091892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.092736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.094525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.096546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.096897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.096917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.099468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.099972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.100481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.101724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.102070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.104058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.106085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.107346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.109131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.109474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.109494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.112533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.114513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.116549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.118570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.118985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.120986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.122972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.124987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.126212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.126804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.126826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.131372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.133363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.134967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.136708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.137049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.139100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.139599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.140096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.140588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.140984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.141005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.144820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.146814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.148488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.148994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.149520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.150034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.151467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.153243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.155281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.155626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.155647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.158137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.158634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.159135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.161114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.161458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.163448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.165195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.167081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.169071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.169414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.169435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.173357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.175146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.177167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.179156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.179567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.181349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.183374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.185357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.185858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.186397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.186418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.191061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.192275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.194053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.196055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.196398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.196981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.197477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.197990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:43.759 [2024-07-15 12:18:57.199202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.199581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.199601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.203643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.205676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.206186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.206678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.207251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.208313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.210100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.212127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.214154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:43.759 [2024-07-15 12:18:57.214616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.386207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.386260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.386311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.386788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.386810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.389701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.389760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.389812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.389882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.390357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.390436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.390510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.390595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.390650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.391134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.391156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.394953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.395460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.395485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.398408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.398466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.398518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.398570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.399870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.402816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.402874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.402930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.402982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.403496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.403569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.403635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.403705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.403777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.404235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.404256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.407294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.407974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.408038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.408090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.408615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.408636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.411576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.411634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.411691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.411744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.412287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.412362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.412418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.412470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.412522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.413041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.413062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.416801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.416876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.417393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.417413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.420321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.420389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.420449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.420500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.024 [2024-07-15 12:18:57.421753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.024 [2024-07-15 12:18:57.421774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.424648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.424710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.424763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.424815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.425347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.425427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.425482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.425533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.425586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.426110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.426131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.428986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.429953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.430390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.430410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.433435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.433492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.433549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.433613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.434946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.437904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.437961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.438597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.438833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.439369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.439390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.442375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.442431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.442484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.442535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.443217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.443858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.447826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.447878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.448451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.448472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.451247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.451302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.451802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.451861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.452404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.452479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.452534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.452588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.452640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.025 [2024-07-15 12:18:57.453133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.025 [2024-07-15 12:18:57.453155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [message repeated many times through 2024-07-15 12:18:57.788897] 00:37:44.289 [2024-07-15 12:18:57.788897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.289 [2024-07-15 12:18:57.790340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.790880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.790902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.795363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.797321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.799067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.800939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.801290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.803289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.803796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.804293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.804792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.805196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.289 [2024-07-15 12:18:57.805217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.289 [2024-07-15 12:18:57.809052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.811098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.812676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.813174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.813702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.814211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.815712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.817491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.819489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.819841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.819862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.822434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.290 [2024-07-15 12:18:57.822941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.823437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.825169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.825516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.827518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.828756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.830770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.832788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.833253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.833274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.837999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.840037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.841057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.290 [2024-07-15 12:18:57.842897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.843242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.845283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.846629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.847132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.847634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.848201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.848223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.851482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.851996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.852496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.852998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.853600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.290 [2024-07-15 12:18:57.854122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.854622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.855126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.855621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.856179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.856201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.859626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.860134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.860650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.861152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.861657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.862169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.862673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.290 [2024-07-15 12:18:57.863200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.863704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.864223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.864249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.867649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.868159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.868221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.868721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.869329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.869845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.870346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.870847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.871345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.290 [2024-07-15 12:18:57.871908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.871930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.875200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.875707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.876208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.876280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.876768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.877277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.877778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.878277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.878785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.879281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.290 [2024-07-15 12:18:57.879302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.882461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.882523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.882575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.882638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.883955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.886895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.886957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.887009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.887852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.888408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.888429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.891388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.891445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.891497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.891550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.892079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.892875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.895276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.895332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.895383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.895435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.895978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.896042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.896095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.896160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.896212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.896694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.896721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.899243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.899300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.899351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.899405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.899947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.900022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.900079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.900144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.900198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.900729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.900751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.903986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.904039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.904090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.904160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.904659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.556 [2024-07-15 12:18:57.904679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.907276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.907337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.556 [2024-07-15 12:18:57.907388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.907440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.907959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.908707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.557 [2024-07-15 12:18:57.911188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.557 [2024-07-15 12:18:57.911252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:44.557-00:37:44.560 [identical *ERROR* line repeated for successive allocation attempts, wall-clock timestamps 12:18:57.911320 through 12:18:58.008934]
00:37:44.560 [2024-07-15 12:18:58.008957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.011756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.012257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.014239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.016226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.016570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.018342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.020295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.022286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.024307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.024712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.024734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.029457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.560 [2024-07-15 12:18:58.031496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.033521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.035019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.035404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.037449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.039467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.039975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.040470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.041016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.041039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.044381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.046157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.048186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.560 [2024-07-15 12:18:58.050176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.050646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.051169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.051668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.052720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.054482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.054835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.054855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.058934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.059444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.059949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.060446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.060863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.560 [2024-07-15 12:18:58.062406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.064439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.066499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.067942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.068370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.068390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.071400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.072617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.074395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.076413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.076762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.077988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.079762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.560 [2024-07-15 12:18:58.081781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.083771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.084270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.560 [2024-07-15 12:18:58.084291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.089035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.091055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.092269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.094051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.094396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.096429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.097426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.097927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.098416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.561 [2024-07-15 12:18:58.098956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.098977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.102812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.104820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.106832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.107334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.107859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.108362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.109191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.110975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.112994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.113338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.113358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.561 [2024-07-15 12:18:58.115978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.116483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.116984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.118615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.119004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.121043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.123038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.124608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.126409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.126761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.126782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.130906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.132694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.561 [2024-07-15 12:18:58.134712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.136692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.137087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.138876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.140902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.142893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.143389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.143915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.561 [2024-07-15 12:18:58.143937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.148509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.149857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.151622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.153627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.153979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.154490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.154996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.155487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.157069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.157451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.157472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.161534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.162931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.163437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.163937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.164480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.166405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.168334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.170356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.171572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.171940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.171962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.174917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.175550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.177324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.179353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.179703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.180924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.182699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.184712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.186739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.187194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.187215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.191993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.194006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.195213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.196985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.197327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.199359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.200572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.201068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.201561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.202090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.202113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.205964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.207982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.210003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.210646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.211168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.211674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.212248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.214021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.216031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.216378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.216398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.219051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.219551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.220051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.221497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.221919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.223962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.225985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.227364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.229126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.229470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.229490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.233417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.235218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.850 [2024-07-15 12:18:58.237215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.850 [2024-07-15 12:18:58.239200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
(same message repeated continuously from 12:18:58.239200 through 12:18:58.434275; duplicates omitted) 
00:37:44.853 [2024-07-15 12:18:58.434275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:44.853 [2024-07-15 12:18:58.434331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.434381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.434432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.434964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:44.853 [2024-07-15 12:18:58.435769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.438394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.438452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.438510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.127 [2024-07-15 12:18:58.438565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.438914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.438984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.439038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.439089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.439144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.439536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.439558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.127 [2024-07-15 12:18:58.442944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.442997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.443049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.443100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.443498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.443518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.445673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.445749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.127 [2024-07-15 12:18:58.445804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.445855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.446380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.446808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.449973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.450745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.451081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.451100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.453272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.453327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.453385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.453437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.454754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.457070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.457922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.458291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.458316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.460884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.460941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.460996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.461817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.462157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.462178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.468826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.468949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.469000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.469051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.469545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.469565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.474941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.474999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.475064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.475427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.475448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.480300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.480362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.480414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.480464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.480952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.481022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.481074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.481125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.481176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.481594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.481614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.486803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.486863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.486926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.486978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.487897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.128 [2024-07-15 12:18:58.487917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.493819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.493882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.128 [2024-07-15 12:18:58.493935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.493986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.494547] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.494620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.494673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.494733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.494785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.495125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.495145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.129 [2024-07-15 12:18:58.501167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.501949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.502001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.502550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.502573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.129 [2024-07-15 12:18:58.507516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.507982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.508032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.508083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.508420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.508440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.512653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.512728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.512788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.512844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.129 [2024-07-15 12:18:58.513265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.129 [2024-07-15 12:18:58.513334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.393 [... previous message repeated ~270 times between 12:18:58.513 and 12:18:58.781 ...] 
00:37:45.393 [2024-07-15 12:18:58.783469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.393 [2024-07-15 12:18:58.784018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.393 [2024-07-15 12:18:58.784407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.393 [2024-07-15 12:18:58.785228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.393 [2024-07-15 12:18:58.785622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.786042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.787899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.788185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.788201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.788215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.788230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.791663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.792993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.794667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.796344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.797012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.797403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.797797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.798203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.798639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.798656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.798672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.798693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.804034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.805376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.807087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.808991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.810294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.810691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.811984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.815609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.817531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.818151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.819793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.821742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.823411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.824890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.825279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.825734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.825751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.825766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.825784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.831212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.832882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.833811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.835572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.837594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.839373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.841289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.841689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.842173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.842191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.842207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.842223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.846297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.848120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.849794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.851459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.853015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.854326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.856004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.857822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.858094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.858111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.858125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.858139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.863681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.865074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.866725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.868390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.869348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.870928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.872833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.874528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.874805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.874821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.874835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.874850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.877467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.877864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.879327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.880641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.882693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.884363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.885039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.886945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.394 [2024-07-15 12:18:58.887302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.887324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.887338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.887352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.891848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.892245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.893100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.894524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.896442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.898116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.394 [2024-07-15 12:18:58.899440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.900789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.901124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.901141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.901155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.901169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.903033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.903426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.903819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.904208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.905778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.907107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.908785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.910648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.910926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.910942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.910956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.910970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.915481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.915903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.916294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.916692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.917497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.919385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.920835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.922501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.922782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.922798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.922812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.922827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.926167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.927847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.929178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.930294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.930961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.931351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.932722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.933638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.934098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.934115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.934130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.934146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.939546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.941401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.943069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.944747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.945477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.945873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.946260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.946649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.947094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.947112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.947132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.947147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.949382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.951205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.952506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.954173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.956363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.956763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.957152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.957541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.957962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.957979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.957994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.958008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.961455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.395 [2024-07-15 12:18:58.961857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.395 [2024-07-15 12:18:58.962252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 (last message repeated through [2024-07-15 12:18:59.082519])
00:37:45.662 [2024-07-15 12:18:59.082533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.082548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.084782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.085178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.085196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.662 [2024-07-15 12:18:59.085214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.085229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.085244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.088860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.089127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.662 [2024-07-15 12:18:59.089142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.089156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.089171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.662 [2024-07-15 12:18:59.089184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.090885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.090930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.090969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.091730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.091788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.095940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.096545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.096876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.098509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.098553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.098596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.098636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.098998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.099142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.099508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.103830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.103878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.103918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.103961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.104489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.104951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.106482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.106525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.106565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.106604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.106937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.106996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.107451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.112694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.112747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.112788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.112829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.663 [2024-07-15 12:18:59.113261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.663 [2024-07-15 12:18:59.113985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.115796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.115839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.115882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.664 [2024-07-15 12:18:59.115922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.116703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.664 [2024-07-15 12:18:59.121184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.121858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.122287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.122305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.122324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.122339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.122354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.664 [2024-07-15 12:18:59.124524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.124990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.125408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.664 [2024-07-15 12:18:59.130077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.130666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.131104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.131120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.131135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.664 [2024-07-15 12:18:59.131149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.664 [2024-07-15 12:18:59.131167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.282826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [identical error repeated continuously between 12:18:59.131167 and 12:18:59.282826; duplicate log lines elided]
00:37:45.928 [2024-07-15 12:18:59.283565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.283977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.284822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.290161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.290951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.292435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.294353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.294630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.928 [2024-07-15 12:18:59.296329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.297553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.297950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.298822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.303902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.304466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.306389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.307967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.928 [2024-07-15 12:18:59.308242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.309937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.311597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.311993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.312891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.318042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.319006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.320749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.928 [2024-07-15 12:18:59.322070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.322347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.324242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.326163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.326555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.326949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.327406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.327423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.327439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.327454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.327469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.332713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.333956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.928 [2024-07-15 12:18:59.335405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.336725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.337000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.338868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.340501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.341149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.341542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.341996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.342014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.342031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.342046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.928 [2024-07-15 12:18:59.342061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.347683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.349245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.350397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.351720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.351994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.353667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.355346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.356262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.356652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.357109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.357127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.357142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.357156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.357171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.363120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.364945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.365762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.367220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.367496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.369184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.370862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.372983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.372997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.379177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.381114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.381779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.383384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.383661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.385345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.387025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.388388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.388784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.389230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.389250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.389265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.389280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.389295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.395415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.397284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.397766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.399568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.399852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.401533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.403188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.404751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.405141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.405599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.405616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.405632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.405647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.405664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.412070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.413981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.414721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.416340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.416617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.418300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.419979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.421344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.421739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.422177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.422194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.422210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.422225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.422240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.428361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.430345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.431868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.433183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.433459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.435395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.437177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.437671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.438066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.438510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.438528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.438544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.438559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.438574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.444233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.445517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.446838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.448498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.448781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.450455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.451252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.451663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.929 [2024-07-15 12:18:59.452059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.452431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.452448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.452463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.452477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.452492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.456856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.929 [2024-07-15 12:18:59.457263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.457652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.458047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.458485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.458888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.459282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:45.930 [2024-07-15 12:18:59.459691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.460597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.463918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.464319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.464716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.465105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.465543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:45.930 [2024-07-15 12:18:59.465951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.196 [2024-07-15 12:18:59.592766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.592834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.592875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.592915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.593681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.196 [2024-07-15 12:18:59.593701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.598616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.598666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.598718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.598759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.196 [2024-07-15 12:18:59.599538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.599552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.604950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.605000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.605050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.605094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.605363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.196 [2024-07-15 12:18:59.605413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.605848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.605876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.610912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.610954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.614596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.614656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.614702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.614743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.615518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.615576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.620877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.620927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.620971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.621467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.621808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.625464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.625514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.625556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.625598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.626161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.626533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.631779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.631860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.632131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.632147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.632161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.632175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.632192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.636501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.636551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.636593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.636635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.637125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.637577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.642843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.642995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.643361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.647510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.647563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.647604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.197 [2024-07-15 12:18:59.647643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.648011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.648068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.197 [2024-07-15 12:18:59.648109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.648711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.198 [2024-07-15 12:18:59.653387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.653894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.654203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.654219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.654233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.654247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.654261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.198 [2024-07-15 12:18:59.658420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.198 [2024-07-15 12:18:59.658477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:46.462 [2024-07-15 12:18:59.825516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[the same *ERROR* line repeats continuously between these two timestamps; repeats omitted]
00:37:46.462 [2024-07-15 12:18:59.827432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.827758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.827775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.827789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.827803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.827820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.831963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.832696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.834257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.834646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.835085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.836441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.837765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.462 [2024-07-15 12:18:59.839427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.841458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.845640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.846048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.846439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.846833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.847294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.847936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.462 [2024-07-15 12:18:59.849583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.851463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.853453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.858829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.860326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.861112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.861502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.861927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.462 [2024-07-15 12:18:59.863851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.864248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.462 [2024-07-15 12:18:59.864637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.866397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.872190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.874058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.875983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.876377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.876870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.877273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.877661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.878057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.878869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.879141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.879158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.879173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.879188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.879203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.885868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.887543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.889192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.890034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.890312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.891486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.891881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.892272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.894731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.899813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.901306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.902628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.904301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.904575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.906282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.906857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.907247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.907635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.908050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.908067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.908082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.908097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.908112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.913468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.914462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.915776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.917623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.917901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.919581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.920670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.922176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.922975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.923408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.923425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.923441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.923456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.923471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.928235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.930171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.931984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.932489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.932765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.934081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.935748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.937418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.939091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.939483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.939499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.939514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.939528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.939542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.944202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.945925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.947593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.948361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.948635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.949959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.951626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.953362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.955279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.955770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.955787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.955801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.463 [2024-07-15 12:18:59.955815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.955829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.959431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.959836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.960226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.962141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.962457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.964145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.965826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.463 [2024-07-15 12:18:59.967616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.968492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.968769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.968785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.464 [2024-07-15 12:18:59.968800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.968815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.968829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.970674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.971075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.971463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.971870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.972314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.973207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.974599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.976513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.978272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.978650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.464 [2024-07-15 12:18:59.978667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.978681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.978705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.978719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.981985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.982478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.984384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.984781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.985221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.986009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.987512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.987906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.464 [2024-07-15 12:18:59.988295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.464 [2024-07-15 12:18:59.988565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously between 12:18:59.988565 and 12:19:00.123535; duplicate lines omitted]
00:37:46.730 [2024-07-15 12:19:00.123590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.730 [2024-07-15 12:19:00.123632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.730 [2024-07-15 12:19:00.124061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.730 [2024-07-15 12:19:00.124130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.730 [2024-07-15 12:19:00.124174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.124605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.126778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.126826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.126867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.126909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.127986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.128002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.128016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.130512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.130559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.130606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.130649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.131825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.131842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.133845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.133912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.133953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.133995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.134474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.134529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.134572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.134615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.134657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.135112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.135131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.135146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.135166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.135181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.137972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.138462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.138493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.141878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.142289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.142307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.142323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.142339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.142354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.144960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.731 [2024-07-15 12:19:00.145503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.145571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.147801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.147847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.147889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.147932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.148362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.148419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.148464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.731 [2024-07-15 12:19:00.148506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.148548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.148957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.148976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.148992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.149008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.149023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.151959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.152008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.152617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.154792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.154839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.154881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.154924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.155321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.155390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.155456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.155513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.155557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.156007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.156026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.156041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.156055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.156068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.158521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.158568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.158616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.158656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.159162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.159722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.161741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.161789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.161830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.161873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.162297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.162837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.165379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.165438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.732 [2024-07-15 12:19:00.165479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.732 [2024-07-15 12:19:00.165521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:46.735 [2024-07-15 12:19:00.277095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.277110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.277125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.277140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.280432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.282107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.283049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.284627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.284989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.285399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.285798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.287713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.288110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.735 [2024-07-15 12:19:00.288556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.288575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.288590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.288604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.288621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.291813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.293001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.294488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.295812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.296085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.298029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.299778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.735 [2024-07-15 12:19:00.300284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.736 [2024-07-15 12:19:00.302195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.302664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.302683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.302702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.302717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.302733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.305221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.306825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.308145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.309788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.310061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.311864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.312339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.736 [2024-07-15 12:19:00.314179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.315854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.316129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.316147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.316162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.316177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.316191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.318199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.736 [2024-07-15 12:19:00.318594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.319794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.320882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.321331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.321738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.998 [2024-07-15 12:19:00.323647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.324966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.326977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.330294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.331965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.333338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.334453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.334735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.998 [2024-07-15 12:19:00.335146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.335535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.337056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.337836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.338266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.338285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.338302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.338318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.338334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.341596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.343302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.344293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.345613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.998 [2024-07-15 12:19:00.345890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.347580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.349258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.350296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.351752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.352058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.352077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.352093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.352109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.352124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.354569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.355600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.356923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.998 [2024-07-15 12:19:00.358714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.358987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.360666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.361675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.363340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.364999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.367531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.367933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.998 [2024-07-15 12:19:00.368534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.370224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.370737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.371142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.372476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.373793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.375781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.998 [2024-07-15 12:19:00.379099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.999 [2024-07-15 12:19:00.380843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.382762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.383318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.383591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.384186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.384580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.385454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.386868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.387351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.387371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.387387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.387404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.387419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.999 [2024-07-15 12:19:00.389737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.391452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.393244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.393733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.394160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.394567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.395134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.396853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.397243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.397682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.397706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.397721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.397736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.999 [2024-07-15 12:19:00.397751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.401050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.401451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.401850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.402241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.402514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.403496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.403899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.404656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.406286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.406558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.406577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.406591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.999 [2024-07-15 12:19:00.406606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.406620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.409841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.411523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.413236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.415139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.415626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.416042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.416435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.417815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.418726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.419163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:46.999 [2024-07-15 12:19:00.419183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:46.999 [2024-07-15 12:19:00.419199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:47.266 [... previous message repeated for each failed allocation, with timestamps ranging 12:19:00.419216 through 12:19:00.615546 ...]
00:37:47.266 [2024-07-15 12:19:00.616015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.616744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.266 [2024-07-15 12:19:00.619179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.619982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.620001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.620017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.620033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.620047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.621791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.266 [2024-07-15 12:19:00.621839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.621880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.621920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.622980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.625178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.625893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.626324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.626343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.626359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.626374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.626389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.628478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.628526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.628572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.628614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.629777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.629792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.632972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.633383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.633402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.633417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.633433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.633447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.635656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.635711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.635753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.635794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.636810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.636844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.639980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.640023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.640427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.640446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.640461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.640476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.640490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.643859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.267 [2024-07-15 12:19:00.644300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.644320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.644335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.644350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.644365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.646564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.646611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.267 [2024-07-15 12:19:00.646653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.646701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.268 [2024-07-15 12:19:00.647308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.647833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.268 [2024-07-15 12:19:00.650722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.650764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.651197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.651221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.651238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.651256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.651271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.653525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.653574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.653620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.653662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.268 [2024-07-15 12:19:00.654128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.654722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.656856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.656903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.656946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.656987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.268 [2024-07-15 12:19:00.657402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.268 [2024-07-15 12:19:00.657464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:37:47.271 [... same "Failed to get src_mbufs!" error repeated continuously through 2024-07-15 12:19:00.739561 ...]
00:37:47.271 [2024-07-15 12:19:00.739577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.739591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.743271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.743674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.744076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.744483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.744932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.746869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.748356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.750028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.751689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.751974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.751994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.271 [2024-07-15 12:19:00.752009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.752025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.752040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.755277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.756458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.756859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.757253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.757749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.758723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.760161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.761467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.762761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.763038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.271 [2024-07-15 12:19:00.763056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.763071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.763086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.763100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.766436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.768039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.769155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.770470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.770756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.771164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.771559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.771958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.772353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.271 [2024-07-15 12:19:00.772707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.772730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.772746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.772761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.772776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.776084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.776618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.777827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.779165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.779463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.781081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.782826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.784141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.271 [2024-07-15 12:19:00.785402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.785857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.785876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.785892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.785908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.785926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.789585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.271 [2024-07-15 12:19:00.790710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.792124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.794019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.794300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.795741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.796142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.272 [2024-07-15 12:19:00.797517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.798734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.799010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.799028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.799043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.799058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.799076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.801095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.801493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.801896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.802546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.802834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.803791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:47.272 [2024-07-15 12:19:00.805534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.807213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.808676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.814885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.815293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:37:47.272 [2024-07-15 12:19:00.815314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:37:48.208
00:37:48.208 Latency(us)
00:37:48.208 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:48.208 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x0 length 0x100
00:37:48.208 crypto_ram : 6.07 42.19 2.64 0.00 0.00 2953251.62 260776.29 2494699.07
00:37:48.208 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x100 length 0x100
00:37:48.208 crypto_ram : 5.97 31.98 2.00 0.00 0.00 3669049.30 82062.47 3092843.30
00:37:48.208 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x0 length 0x100
00:37:48.208 crypto_ram1 : 6.07 42.18 2.64 0.00 0.00 2848835.67 260776.29 2305043.59
00:37:48.208 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x100 length 0x100
00:37:48.208 crypto_ram1 : 6.01 34.96 2.18 0.00 0.00 3277222.25 71576.71 2844832.28
00:37:48.208 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x0 length 0x100
00:37:48.208 crypto_ram2 : 5.62 257.80 16.11 0.00 0.00 441233.26 23706.94 605438.66
00:37:48.208 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x100 length 0x100
00:37:48.208 crypto_ram2 : 5.69 211.46 13.22 0.00 0.00 529339.56 39891.48 682030.30
00:37:48.208 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x0 length 0x100
00:37:48.208 crypto_ram3 : 5.72 268.40 16.78 0.00 0.00 411708.72 42170.99 465020.66
00:37:48.208 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:37:48.208 Verification LBA range: start 0x100 length 0x100
00:37:48.208 crypto_ram3 : 5.85 227.78 14.24 0.00 0.00 475903.88 27126.21 641910.87
00:37:48.208 ===================================================================================================================
00:37:48.208 Total : 1116.75 69.80 0.00 0.00 841990.06 23706.94 3092843.30
00:37:48.467
00:37:48.467 real 0m9.252s
00:37:48.467 user 0m17.525s
00:37:48.467 sys 0m0.476s
00:37:48.467 12:19:01 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:48.467 12:19:01 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:37:48.467 ************************************
00:37:48.467 END TEST bdev_verify_big_io
00:37:48.467 ************************************
00:37:48.467 12:19:01 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:48.467 12:19:01 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:48.467 12:19:01 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:37:48.467 12:19:01 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:48.467 12:19:01 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:48.467 ************************************
00:37:48.467 START TEST bdev_write_zeroes
00:37:48.467 ************************************
00:37:48.467 12:19:01 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:48.467 [2024-07-15 12:19:01.996610] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:37:48.467 [2024-07-15 12:19:01.996673] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1675085 ]
00:37:48.726 [2024-07-15 12:19:02.125211] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:37:48.726 [2024-07-15 12:19:02.229405] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:37:48.726 [2024-07-15 12:19:02.250669] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:37:48.726 [2024-07-15 12:19:02.258702] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:37:48.726 [2024-07-15 12:19:02.266721] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:37:48.984 [2024-07-15 12:19:02.372516] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:37:51.518 [2024-07-15 12:19:04.579807] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:37:51.518 [2024-07-15 12:19:04.579867] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:37:51.518 [2024-07-15 12:19:04.579882] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:51.518 [2024-07-15 12:19:04.587826] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:37:51.518 [2024-07-15 12:19:04.587846] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:37:51.518 [2024-07-15 12:19:04.587858] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:51.518 [2024-07-15 12:19:04.595846] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:37:51.518 [2024-07-15 12:19:04.595864] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:37:51.518 [2024-07-15 12:19:04.595880] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:51.518 [2024-07-15 12:19:04.603867] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:37:51.518 [2024-07-15 12:19:04.603884] bdev.c:8157:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:37:51.518 [2024-07-15 12:19:04.603896] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:37:51.518 Running I/O for 1 seconds...
00:37:52.455
00:37:52.455 Latency(us)
00:37:52.455 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:37:52.455 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:52.455 crypto_ram : 1.02 2014.01 7.87 0.00 0.00 63009.62 5584.81 76135.74
00:37:52.455 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:52.455 crypto_ram1 : 1.03 2027.12 7.92 0.00 0.00 62318.06 5556.31 70664.90
00:37:52.455 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:52.455 crypto_ram2 : 1.02 15560.71 60.78 0.00 0.00 8108.23 2421.98 10656.72
00:37:52.455 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:37:52.455 crypto_ram3 : 1.02 15539.44 60.70 0.00 0.00 8082.93 2436.23 8491.19
00:37:52.455 ===================================================================================================================
00:37:52.455 Total : 35141.28 137.27 0.00 0.00 14396.44 2421.98 76135.74
00:37:52.713
00:37:52.713 real 0m4.205s
00:37:52.713 user 0m3.785s
00:37:52.713 sys 0m0.371s
00:37:52.713 12:19:06 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:52.713 12:19:06 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:37:52.713 ************************************
00:37:52.713 END TEST bdev_write_zeroes
00:37:52.713 ************************************
00:37:52.713 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 0
00:37:52.713 12:19:06 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:52.713 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:37:52.713 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:52.713 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:52.713 ************************************
00:37:52.713 START TEST bdev_json_nonenclosed
00:37:52.713 ************************************
00:37:52.713 12:19:06 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:52.972 [2024-07-15 12:19:06.330443] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:37:52.972 [2024-07-15 12:19:06.330576] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1675630 ]
00:37:52.972 [2024-07-15 12:19:06.531336] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:37:53.231 [2024-07-15 12:19:06.633678] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:37:53.231 [2024-07-15 12:19:06.633747] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:37:53.231 [2024-07-15 12:19:06.633768] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:37:53.231 [2024-07-15 12:19:06.633780] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:37:53.231
00:37:53.231 real 0m0.513s
00:37:53.231 user 0m0.291s
00:37:53.231 sys 0m0.218s
00:37:53.231 12:19:06 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1123 -- # es=234
00:37:53.231 12:19:06 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:53.231 12:19:06 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:37:53.231 ************************************
00:37:53.231 END TEST bdev_json_nonenclosed
00:37:53.231 ************************************
00:37:53.231 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:37:53.231 12:19:06 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # true
00:37:53.231 12:19:06 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:53.231 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1099 -- # '[' 13 -le 1 ']'
00:37:53.231 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:53.231 12:19:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:53.231 ************************************
00:37:53.231 START TEST bdev_json_nonarray
00:37:53.231 ************************************
00:37:53.231 12:19:06 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:37:53.489 [2024-07-15 12:19:06.925507] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization...
00:37:53.490 [2024-07-15 12:19:06.925639] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1675814 ]
00:37:53.747 [2024-07-15 12:19:07.119679] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:37:53.747 [2024-07-15 12:19:07.220126] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:37:53.747 [2024-07-15 12:19:07.220201] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:37:53.747 [2024-07-15 12:19:07.220222] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:37:53.747 [2024-07-15 12:19:07.220235] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:37:53.747
00:37:53.747 real 0m0.503s
00:37:53.747 user 0m0.286s
00:37:53.747 sys 0m0.213s
00:37:53.747 12:19:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1123 -- # es=234
00:37:53.747 12:19:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:53.747 12:19:07 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:37:53.747 ************************************
00:37:53.747 END TEST bdev_json_nonarray
00:37:53.747 ************************************
00:37:54.007 12:19:07 blockdev_crypto_qat -- common/autotest_common.sh@1142 -- # return 234
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # true
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]]
00:37:54.007 12:19:07 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]]
00:37:54.007
00:37:54.007 real 1m14.269s
00:37:54.007 user 2m43.995s
00:37:54.007 sys 0m9.719s
00:37:54.007 12:19:07 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # xtrace_disable
00:37:54.007 12:19:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:37:54.007 ************************************
00:37:54.007 END TEST blockdev_crypto_qat
00:37:54.007 ************************************
00:37:54.007 12:19:07 -- common/autotest_common.sh@1142 -- # return 0
00:37:54.007 12:19:07 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:37:54.007 12:19:07 -- common/autotest_common.sh@1099 -- # '[' 2 -le 1 ']'
00:37:54.007 12:19:07 -- common/autotest_common.sh@1105 -- # xtrace_disable
00:37:54.007 12:19:07 -- common/autotest_common.sh@10 -- # set +x
00:37:54.007 ************************************
00:37:54.007 START TEST chaining
00:37:54.007 ************************************
00:37:54.007 12:19:07 chaining -- common/autotest_common.sh@1123 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh
00:37:54.007 * Looking for test storage...
00:37:54.007 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:37:54.007 12:19:07 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@7 -- # uname -s 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:005d867c-174e-e711-906e-0012795d9712 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=005d867c-174e-e711-906e-0012795d9712 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:37:54.007 12:19:07 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:37:54.007 12:19:07 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:37:54.007 12:19:07 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:37:54.007 12:19:07 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:37:54.007 12:19:07 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:54.007 12:19:07 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:54.007 12:19:07 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:54.007 12:19:07 chaining -- paths/export.sh@5 -- # export PATH 00:37:54.267 12:19:07 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@47 -- # : 0 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:37:54.267 12:19:07 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:37:54.267 12:19:07 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:37:54.267 12:19:07 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:37:54.267 12:19:07 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:37:54.267 12:19:07 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@296 -- # e810=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@297 -- # x722=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@298 -- # mlx=() 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@336 -- # return 1 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:38:02.384 WARNING: No supported devices were found, fallback requested for tcp test 00:38:02.384 12:19:15 chaining -- 
nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:38:02.384 Cannot find device "nvmf_tgt_br" 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@155 -- # true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:38:02.384 Cannot find device "nvmf_tgt_br2" 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@156 -- # true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:38:02.384 Cannot find device "nvmf_tgt_br" 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@158 -- # 
true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:38:02.384 Cannot find device "nvmf_tgt_br2" 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@159 -- # true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:38:02.384 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@162 -- # true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:38:02.384 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@163 -- # true 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:38:02.384 12:19:15 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@183 -- # ip link set 
nvmf_init_if up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@192 -- # ip link add nvmf_br type bridge 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:38:02.385 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:38:02.385 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.090 ms 00:38:02.385 00:38:02.385 --- 10.0.0.2 ping statistics --- 00:38:02.385 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:02.385 rtt min/avg/max/mdev = 0.090/0.090/0.090/0.000 ms 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:38:02.385 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 
00:38:02.385 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.058 ms 00:38:02.385 00:38:02.385 --- 10.0.0.3 ping statistics --- 00:38:02.385 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:02.385 rtt min/avg/max/mdev = 0.058/0.058/0.058/0.000 ms 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:38:02.385 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:38:02.385 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.028 ms 00:38:02.385 00:38:02.385 --- 10.0.0.1 ping statistics --- 00:38:02.385 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:02.385 rtt min/avg/max/mdev = 0.028/0.028/0.028/0.000 ms 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@433 -- # return 0 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:38:02.385 12:19:15 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@481 -- # nvmfpid=1679478 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 
-e 0xFFFF -m 0x2 00:38:02.385 12:19:15 chaining -- nvmf/common.sh@482 -- # waitforlisten 1679478 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@829 -- # '[' -z 1679478 ']' 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:02.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:02.385 12:19:15 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:02.385 [2024-07-15 12:19:15.890081] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:02.385 [2024-07-15 12:19:15.890143] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:02.644 [2024-07-15 12:19:16.030727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:02.644 [2024-07-15 12:19:16.152759] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:38:02.644 [2024-07-15 12:19:16.152813] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:38:02.644 [2024-07-15 12:19:16.152831] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:38:02.644 [2024-07-15 12:19:16.152848] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:38:02.644 [2024-07-15 12:19:16.152862] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:38:02.644 [2024-07-15 12:19:16.152900] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:03.581 12:19:16 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 12:19:16 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@69 -- # mktemp 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.SwOlBUQqdC 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@69 -- # mktemp 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.Zp1gFyYIn4 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 malloc0 00:38:03.581 true 00:38:03.581 true 00:38:03.581 [2024-07-15 12:19:16.937787] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:38:03.581 crypto0 00:38:03.581 [2024-07-15 12:19:16.945815] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:38:03.581 crypto1 00:38:03.581 [2024-07-15 12:19:16.953962] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:38:03.581 [2024-07-15 12:19:16.970231] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@85 -- # 
update_stats 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:03.581 12:19:16 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:03.581 12:19:16 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:38:03.581 12:19:17 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:03.581 12:19:17 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:03.581 12:19:17 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@88 -- # 
dd if=/dev/urandom of=/tmp/tmp.SwOlBUQqdC bs=1K count=64 00:38:03.840 64+0 records in 00:38:03.840 64+0 records out 00:38:03.840 65536 bytes (66 kB, 64 KiB) copied, 0.00106422 s, 61.6 MB/s 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.SwOlBUQqdC --ob Nvme0n1 --bs 65536 --count 1 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@25 -- # local config 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:38:03.840 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@31 -- # config='{ 00:38:03.840 "subsystems": [ 00:38:03.840 { 00:38:03.840 "subsystem": "bdev", 00:38:03.840 "config": [ 00:38:03.840 { 00:38:03.840 "method": "bdev_nvme_attach_controller", 00:38:03.840 "params": { 00:38:03.840 "trtype": "tcp", 00:38:03.840 "adrfam": "IPv4", 00:38:03.840 "name": "Nvme0", 00:38:03.840 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:03.840 "traddr": "10.0.0.2", 00:38:03.840 "trsvcid": "4420" 00:38:03.840 } 00:38:03.840 }, 00:38:03.840 { 00:38:03.840 "method": "bdev_set_options", 00:38:03.840 "params": { 00:38:03.840 "bdev_auto_examine": false 00:38:03.840 } 00:38:03.840 } 00:38:03.840 ] 00:38:03.840 } 00:38:03.840 ] 00:38:03.840 }' 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.SwOlBUQqdC --ob Nvme0n1 --bs 65536 --count 1 00:38:03.840 12:19:17 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:38:03.840 "subsystems": [ 00:38:03.840 { 00:38:03.840 "subsystem": "bdev", 00:38:03.840 "config": [ 00:38:03.840 { 00:38:03.840 "method": "bdev_nvme_attach_controller", 00:38:03.840 "params": { 
00:38:03.840 "trtype": "tcp", 00:38:03.840 "adrfam": "IPv4", 00:38:03.840 "name": "Nvme0", 00:38:03.840 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:03.840 "traddr": "10.0.0.2", 00:38:03.840 "trsvcid": "4420" 00:38:03.840 } 00:38:03.840 }, 00:38:03.840 { 00:38:03.840 "method": "bdev_set_options", 00:38:03.840 "params": { 00:38:03.841 "bdev_auto_examine": false 00:38:03.841 } 00:38:03.841 } 00:38:03.841 ] 00:38:03.841 } 00:38:03.841 ] 00:38:03.841 }' 00:38:03.841 [2024-07-15 12:19:17.292504] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:03.841 [2024-07-15 12:19:17.292574] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1679704 ] 00:38:03.841 [2024-07-15 12:19:17.424669] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:04.099 [2024-07-15 12:19:17.527218] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:04.617  Copying: 64/64 [kB] (average 20 MBps) 00:38:04.617 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:17 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:04.617 12:19:17 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:17 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.617 12:19:17 chaining -- common/autotest_common.sh@587 -- 
# [[ 0 == 0 ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.617 12:19:18 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@96 -- # update_stats 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.617 
12:19:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:04.617 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.617 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.876 12:19:18 
chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:04.876 12:19:18 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.Zp1gFyYIn4 --ib Nvme0n1 --bs 65536 --count 1 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@25 -- # local config 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:38:04.876 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@31 -- # config='{ 00:38:04.876 "subsystems": [ 00:38:04.876 { 00:38:04.876 "subsystem": "bdev", 00:38:04.876 "config": [ 00:38:04.876 { 00:38:04.876 "method": "bdev_nvme_attach_controller", 00:38:04.876 
"params": { 00:38:04.876 "trtype": "tcp", 00:38:04.876 "adrfam": "IPv4", 00:38:04.876 "name": "Nvme0", 00:38:04.876 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:04.876 "traddr": "10.0.0.2", 00:38:04.876 "trsvcid": "4420" 00:38:04.876 } 00:38:04.876 }, 00:38:04.876 { 00:38:04.876 "method": "bdev_set_options", 00:38:04.876 "params": { 00:38:04.876 "bdev_auto_examine": false 00:38:04.876 } 00:38:04.876 } 00:38:04.876 ] 00:38:04.876 } 00:38:04.876 ] 00:38:04.876 }' 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.Zp1gFyYIn4 --ib Nvme0n1 --bs 65536 --count 1 00:38:04.876 12:19:18 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:38:04.876 "subsystems": [ 00:38:04.876 { 00:38:04.876 "subsystem": "bdev", 00:38:04.876 "config": [ 00:38:04.876 { 00:38:04.876 "method": "bdev_nvme_attach_controller", 00:38:04.876 "params": { 00:38:04.876 "trtype": "tcp", 00:38:04.876 "adrfam": "IPv4", 00:38:04.876 "name": "Nvme0", 00:38:04.876 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:04.876 "traddr": "10.0.0.2", 00:38:04.876 "trsvcid": "4420" 00:38:04.876 } 00:38:04.876 }, 00:38:04.876 { 00:38:04.876 "method": "bdev_set_options", 00:38:04.876 "params": { 00:38:04.876 "bdev_auto_examine": false 00:38:04.876 } 00:38:04.876 } 00:38:04.876 ] 00:38:04.876 } 00:38:04.876 ] 00:38:04.876 }' 00:38:04.876 [2024-07-15 12:19:18.465108] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:38:04.876 [2024-07-15 12:19:18.465181] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1679915 ] 00:38:05.135 [2024-07-15 12:19:18.593962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:05.135 [2024-07-15 12:19:18.690062] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:05.662  Copying: 64/64 [kB] (average 20 MBps) 00:38:05.662 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@40 -- # [[ -z 
encrypt ]] 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:05.662 12:19:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.662 12:19:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@40 -- # 
[[ -z copy ]] 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:05.921 12:19:19 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:05.921 12:19:19 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:05.921 12:19:19 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.SwOlBUQqdC /tmp/tmp.Zp1gFyYIn4 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@25 -- # local config 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:38:05.921 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@31 -- # config='{ 00:38:05.921 "subsystems": [ 00:38:05.921 { 00:38:05.921 "subsystem": "bdev", 00:38:05.921 "config": [ 00:38:05.921 { 00:38:05.921 "method": "bdev_nvme_attach_controller", 00:38:05.921 "params": { 00:38:05.921 "trtype": "tcp", 00:38:05.921 "adrfam": "IPv4", 00:38:05.921 "name": "Nvme0", 00:38:05.921 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:05.921 "traddr": "10.0.0.2", 00:38:05.921 "trsvcid": "4420" 00:38:05.921 } 00:38:05.921 }, 00:38:05.921 { 00:38:05.921 "method": "bdev_set_options", 00:38:05.921 "params": { 00:38:05.921 "bdev_auto_examine": false 00:38:05.921 } 00:38:05.921 } 00:38:05.921 ] 00:38:05.921 } 00:38:05.921 ] 00:38:05.921 }' 00:38:05.921 
12:19:19 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:38:05.921 12:19:19 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:38:05.921 "subsystems": [ 00:38:05.921 { 00:38:05.921 "subsystem": "bdev", 00:38:05.921 "config": [ 00:38:05.921 { 00:38:05.921 "method": "bdev_nvme_attach_controller", 00:38:05.921 "params": { 00:38:05.921 "trtype": "tcp", 00:38:05.921 "adrfam": "IPv4", 00:38:05.921 "name": "Nvme0", 00:38:05.921 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:05.921 "traddr": "10.0.0.2", 00:38:05.921 "trsvcid": "4420" 00:38:05.921 } 00:38:05.921 }, 00:38:05.921 { 00:38:05.921 "method": "bdev_set_options", 00:38:05.921 "params": { 00:38:05.921 "bdev_auto_examine": false 00:38:05.921 } 00:38:05.921 } 00:38:05.921 ] 00:38:05.921 } 00:38:05.921 ] 00:38:05.921 }' 00:38:05.921 [2024-07-15 12:19:19.432627] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:38:05.921 [2024-07-15 12:19:19.432705] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1680083 ] 00:38:06.181 [2024-07-15 12:19:19.559969] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:06.181 [2024-07-15 12:19:19.656466] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:06.700  Copying: 64/64 [kB] (average 20 MBps) 00:38:06.700 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@106 -- # update_stats 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:06.700 
12:19:20 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:06.700 12:19:20 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.SwOlBUQqdC --ob Nvme0n1 --bs 4096 --count 16 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@25 -- # local config 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:38:06.700 12:19:20 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:38:06.700 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:38:06.959 12:19:20 chaining -- bdev/chaining.sh@31 -- # config='{ 00:38:06.959 "subsystems": [ 00:38:06.959 { 00:38:06.959 "subsystem": "bdev", 00:38:06.959 "config": [ 00:38:06.959 { 00:38:06.959 "method": "bdev_nvme_attach_controller", 00:38:06.959 "params": { 00:38:06.959 "trtype": "tcp", 00:38:06.959 "adrfam": "IPv4", 00:38:06.959 "name": "Nvme0", 00:38:06.959 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:06.959 "traddr": "10.0.0.2", 00:38:06.959 "trsvcid": "4420" 00:38:06.959 } 00:38:06.959 }, 00:38:06.959 { 00:38:06.959 "method": "bdev_set_options", 00:38:06.959 "params": { 00:38:06.959 "bdev_auto_examine": false 00:38:06.959 } 00:38:06.959 } 00:38:06.959 ] 00:38:06.959 } 00:38:06.959 ] 00:38:06.959 }' 00:38:06.959 12:19:20 chaining -- bdev/chaining.sh@33 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.SwOlBUQqdC --ob Nvme0n1 --bs 4096 --count 16 00:38:06.959 12:19:20 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:38:06.959 "subsystems": [ 00:38:06.959 { 00:38:06.959 "subsystem": "bdev", 00:38:06.959 "config": [ 00:38:06.959 { 00:38:06.959 "method": "bdev_nvme_attach_controller", 00:38:06.959 "params": { 00:38:06.959 "trtype": "tcp", 00:38:06.959 "adrfam": "IPv4", 00:38:06.959 "name": "Nvme0", 00:38:06.959 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:06.959 "traddr": "10.0.0.2", 00:38:06.959 "trsvcid": "4420" 00:38:06.959 } 00:38:06.959 }, 00:38:06.959 { 00:38:06.959 "method": "bdev_set_options", 00:38:06.959 "params": { 00:38:06.959 "bdev_auto_examine": false 00:38:06.959 } 00:38:06.959 } 00:38:06.959 ] 00:38:06.959 } 00:38:06.959 ] 00:38:06.959 }' 00:38:06.959 [2024-07-15 12:19:20.364952] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:06.959 [2024-07-15 12:19:20.365020] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1680144 ] 00:38:06.959 [2024-07-15 12:19:20.498363] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:07.218 [2024-07-15 12:19:20.602712] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:07.477  Copying: 64/64 [kB] (average 9142 kBps) 00:38:07.477 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@40 -- 
# [[ -z '' ]] 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:07.477 12:19:21 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:07.477 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.477 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.477 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 
00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.735 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@114 -- # update_stats 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:07.735 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 
00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:07.736 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.736 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.995 12:19:21 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:07.995 12:19:21 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@117 -- # : 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.Zp1gFyYIn4 --ib Nvme0n1 --bs 4096 --count 16 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@25 -- # local config 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:38:07.995 {"method": "bdev_set_options", "params": 
{"bdev_auto_examine": false}}' 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@31 -- # config='{ 00:38:07.995 "subsystems": [ 00:38:07.995 { 00:38:07.995 "subsystem": "bdev", 00:38:07.995 "config": [ 00:38:07.995 { 00:38:07.995 "method": "bdev_nvme_attach_controller", 00:38:07.995 "params": { 00:38:07.995 "trtype": "tcp", 00:38:07.995 "adrfam": "IPv4", 00:38:07.995 "name": "Nvme0", 00:38:07.995 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:07.995 "traddr": "10.0.0.2", 00:38:07.995 "trsvcid": "4420" 00:38:07.995 } 00:38:07.995 }, 00:38:07.995 { 00:38:07.995 "method": "bdev_set_options", 00:38:07.995 "params": { 00:38:07.995 "bdev_auto_examine": false 00:38:07.995 } 00:38:07.995 } 00:38:07.995 ] 00:38:07.995 } 00:38:07.995 ] 00:38:07.995 }' 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.Zp1gFyYIn4 --ib Nvme0n1 --bs 4096 --count 16 00:38:07.995 12:19:21 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:38:07.995 "subsystems": [ 00:38:07.995 { 00:38:07.995 "subsystem": "bdev", 00:38:07.995 "config": [ 00:38:07.995 { 00:38:07.995 "method": "bdev_nvme_attach_controller", 00:38:07.995 "params": { 00:38:07.995 "trtype": "tcp", 00:38:07.995 "adrfam": "IPv4", 00:38:07.995 "name": "Nvme0", 00:38:07.995 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:38:07.995 "traddr": "10.0.0.2", 00:38:07.995 "trsvcid": "4420" 00:38:07.995 } 00:38:07.995 }, 00:38:07.995 { 00:38:07.995 "method": "bdev_set_options", 00:38:07.995 "params": { 00:38:07.995 "bdev_auto_examine": false 00:38:07.995 } 00:38:07.995 } 00:38:07.995 ] 00:38:07.995 } 00:38:07.995 ] 00:38:07.995 }' 00:38:07.995 [2024-07-15 12:19:21.534287] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 
initialization... 00:38:07.995 [2024-07-15 12:19:21.534351] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1680354 ] 00:38:08.253 [2024-07-15 12:19:21.661033] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:08.254 [2024-07-15 12:19:21.757181] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:08.771  Copying: 64/64 [kB] (average 1306 kBps) 00:38:08.771 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:08.771 12:19:22 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:08.771 12:19:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:38:08.771 12:19:22 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:38:08.771 12:19:22 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:38:08.772 12:19:22 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:38:08.772 12:19:22 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:38:08.772 12:19:22 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:08.772 12:19:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.SwOlBUQqdC /tmp/tmp.Zp1gFyYIn4 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.SwOlBUQqdC /tmp/tmp.Zp1gFyYIn4 00:38:09.031 12:19:22 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@117 -- # sync 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@120 -- # set +e 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:38:09.031 rmmod nvme_tcp 00:38:09.031 rmmod nvme_fabrics 00:38:09.031 rmmod nvme_keyring 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@124 -- # set -e 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@125 -- # return 0 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@489 -- # '[' -n 1679478 ']' 00:38:09.031 12:19:22 chaining -- nvmf/common.sh@490 -- # killprocess 1679478 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@948 -- # '[' -z 
1679478 ']' 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@952 -- # kill -0 1679478 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@953 -- # uname 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1679478 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1679478' 00:38:09.031 killing process with pid 1679478 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@967 -- # kill 1679478 00:38:09.031 12:19:22 chaining -- common/autotest_common.sh@972 -- # wait 1679478 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:38:09.291 12:19:22 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:38:09.291 12:19:22 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:38:09.291 12:19:22 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:38:09.548 12:19:22 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:38:09.548 12:19:22 chaining -- bdev/chaining.sh@132 -- # bperfpid=1680570 00:38:09.548 12:19:22 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 
--wait-for-rpc -z 00:38:09.548 12:19:22 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1680570 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@829 -- # '[' -z 1680570 ']' 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:09.548 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:09.548 12:19:22 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:09.548 [2024-07-15 12:19:22.951284] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:09.548 [2024-07-15 12:19:22.951358] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1680570 ] 00:38:09.548 [2024-07-15 12:19:23.083087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:09.816 [2024-07-15 12:19:23.189296] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:10.423 12:19:23 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:10.423 12:19:23 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:10.423 12:19:23 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:38:10.423 12:19:23 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:10.423 12:19:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:10.681 malloc0 00:38:10.681 true 00:38:10.681 true 00:38:10.681 [2024-07-15 12:19:24.047184] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 
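The `get_stat` / `(( new == old + delta ))` checks traced above (e.g. `(( 47 == stats[sequence_executed] + 16 ))`) all follow one pattern: snapshot a counter from `accel_get_stats`, run a workload, then assert the counter grew by exactly the expected amount. A minimal self-contained bash sketch of that pattern follows; the helper name, the associative array, and the sample values are assumptions for illustration, not the real chaining.sh internals.

```shell
#!/usr/bin/env bash
# Sketch of the stat-delta check pattern from chaining.sh (hypothetical values).

# Stand-in for parsed accel_get_stats output, keyed like the jq filters above.
declare -A stats=( [sequence_executed]=31 [encrypt_executed]=36 )

# get_stat NAME: return the current counter for NAME.
get_stat() {
    echo "${stats[$1]}"
}

before=$(get_stat sequence_executed)

# ... a workload that executes 16 more accel sequences would run here ...
stats[sequence_executed]=$(( before + 16 ))

after=$(get_stat sequence_executed)

# Same style of assertion as the log: fail unless exactly 16 were added.
(( after == before + 16 )) && echo "sequence delta OK: $after"
```

The real script pulls the counters out of JSON with `jq -r '.operations[] | select(.opcode == "encrypt").executed'` rather than a bash array; only the delta arithmetic is shown here.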
00:38:10.681 crypto0 00:38:10.681 [2024-07-15 12:19:24.055221] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:38:10.681 crypto1 00:38:10.681 12:19:24 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:10.681 12:19:24 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:38:10.681 Running I/O for 5 seconds... 00:38:15.956 00:38:15.956 Latency(us) 00:38:15.956 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:15.956 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:38:15.956 Verification LBA range: start 0x0 length 0x2000 00:38:15.956 crypto1 : 5.01 11473.99 44.82 0.00 0.00 22243.06 5527.82 14360.93 00:38:15.956 =================================================================================================================== 00:38:15.956 Total : 11473.99 44.82 0.00 0.00 22243.06 5527.82 14360.93 00:38:15.956 0 00:38:15.956 12:19:29 chaining -- bdev/chaining.sh@146 -- # killprocess 1680570 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@948 -- # '[' -z 1680570 ']' 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@952 -- # kill -0 1680570 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@953 -- # uname 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1680570 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1680570' 00:38:15.956 killing process with pid 1680570 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@967 -- # kill 1680570 00:38:15.956 Received shutdown signal, test time 
was about 5.000000 seconds 00:38:15.956 00:38:15.956 Latency(us) 00:38:15.956 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:15.956 =================================================================================================================== 00:38:15.956 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@972 -- # wait 1680570 00:38:15.956 12:19:29 chaining -- bdev/chaining.sh@152 -- # bperfpid=1681446 00:38:15.956 12:19:29 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:38:15.956 12:19:29 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1681446 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@829 -- # '[' -z 1681446 ']' 00:38:15.956 12:19:29 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:15.957 12:19:29 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:15.957 12:19:29 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:15.957 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:15.957 12:19:29 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:15.957 12:19:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:16.214 [2024-07-15 12:19:29.605153] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:38:16.214 [2024-07-15 12:19:29.605289] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1681446 ] 00:38:16.214 [2024-07-15 12:19:29.797504] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:16.471 [2024-07-15 12:19:29.894494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:17.036 12:19:30 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:17.036 12:19:30 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:17.036 12:19:30 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:38:17.036 12:19:30 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:17.036 12:19:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:17.036 malloc0 00:38:17.036 true 00:38:17.036 true 00:38:17.334 [2024-07-15 12:19:30.633817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:38:17.334 [2024-07-15 12:19:30.633869] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:38:17.334 [2024-07-15 12:19:30.633889] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13bb700 00:38:17.334 [2024-07-15 12:19:30.633901] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:38:17.334 [2024-07-15 12:19:30.634991] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:38:17.334 [2024-07-15 12:19:30.635019] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:38:17.334 pt0 00:38:17.334 [2024-07-15 12:19:30.641847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:38:17.334 crypto0 00:38:17.334 [2024-07-15 12:19:30.649868] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:38:17.334 crypto1 00:38:17.334 12:19:30 chaining -- 
common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:17.334 12:19:30 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:38:17.334 Running I/O for 5 seconds... 00:38:22.599 00:38:22.599 Latency(us) 00:38:22.599 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:22.599 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:38:22.599 Verification LBA range: start 0x0 length 0x2000 00:38:22.599 crypto1 : 5.02 9082.92 35.48 0.00 0.00 28108.63 6525.11 16982.37 00:38:22.599 =================================================================================================================== 00:38:22.599 Total : 9082.92 35.48 0.00 0.00 28108.63 6525.11 16982.37 00:38:22.599 0 00:38:22.599 12:19:35 chaining -- bdev/chaining.sh@167 -- # killprocess 1681446 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@948 -- # '[' -z 1681446 ']' 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@952 -- # kill -0 1681446 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@953 -- # uname 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1681446 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1681446' 00:38:22.599 killing process with pid 1681446 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@967 -- # kill 1681446 00:38:22.599 Received shutdown signal, test time was about 5.000000 seconds 00:38:22.599 00:38:22.599 Latency(us) 00:38:22.599 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:22.599 
=================================================================================================================== 00:38:22.599 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:22.599 12:19:35 chaining -- common/autotest_common.sh@972 -- # wait 1681446 00:38:22.599 12:19:36 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:38:22.599 12:19:36 chaining -- bdev/chaining.sh@170 -- # killprocess 1681446 00:38:22.599 12:19:36 chaining -- common/autotest_common.sh@948 -- # '[' -z 1681446 ']' 00:38:22.599 12:19:36 chaining -- common/autotest_common.sh@952 -- # kill -0 1681446 00:38:22.599 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 952: kill: (1681446) - No such process 00:38:22.599 12:19:36 chaining -- common/autotest_common.sh@975 -- # echo 'Process with pid 1681446 is not found' 00:38:22.599 Process with pid 1681446 is not found 00:38:22.599 12:19:36 chaining -- bdev/chaining.sh@171 -- # wait 1681446 00:38:22.599 12:19:36 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:38:22.599 12:19:36 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:38:22.599 12:19:36 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:38:22.599 12:19:36 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@296 -- # e810=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@297 -- # x722=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@298 -- # mlx=() 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:38:22.599 12:19:36 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@335 -- # (( 0 == 0 )) 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@336 -- # return 1 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@416 -- # [[ no == yes ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@423 -- # [[ phy-fallback == phy ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@426 -- # [[ phy-fallback == phy-fallback ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@427 -- # echo 'WARNING: No supported devices were found, fallback requested for tcp test' 00:38:22.599 WARNING: No supported devices were found, fallback requested for tcp test 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@431 -- # [[ tcp == tcp ]] 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@432 -- # nvmf_veth_init 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@141 -- # NVMF_INITIATOR_IP=10.0.0.1 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@142 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@143 -- # NVMF_SECOND_TARGET_IP=10.0.0.3 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@144 -- # NVMF_BRIDGE=nvmf_br 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@145 -- # NVMF_INITIATOR_INTERFACE=nvmf_init_if 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@146 -- # NVMF_INITIATOR_BRIDGE=nvmf_init_br 00:38:22.599 12:19:36 chaining -- 
nvmf/common.sh@147 -- # NVMF_TARGET_NAMESPACE=nvmf_tgt_ns_spdk 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@148 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@149 -- # NVMF_TARGET_INTERFACE=nvmf_tgt_if 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@150 -- # NVMF_TARGET_INTERFACE2=nvmf_tgt_if2 00:38:22.599 12:19:36 chaining -- nvmf/common.sh@151 -- # NVMF_TARGET_BRIDGE=nvmf_tgt_br 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@152 -- # NVMF_TARGET_BRIDGE2=nvmf_tgt_br2 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@154 -- # ip link set nvmf_init_br nomaster 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@155 -- # ip link set nvmf_tgt_br nomaster 00:38:22.600 Cannot find device "nvmf_tgt_br" 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@155 -- # true 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@156 -- # ip link set nvmf_tgt_br2 nomaster 00:38:22.600 Cannot find device "nvmf_tgt_br2" 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@156 -- # true 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@157 -- # ip link set nvmf_init_br down 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@158 -- # ip link set nvmf_tgt_br down 00:38:22.600 Cannot find device "nvmf_tgt_br" 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@158 -- # true 00:38:22.600 12:19:36 chaining -- nvmf/common.sh@159 -- # ip link set nvmf_tgt_br2 down 00:38:22.861 Cannot find device "nvmf_tgt_br2" 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@159 -- # true 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@160 -- # ip link delete nvmf_br type bridge 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@161 -- # ip link delete nvmf_init_if 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@162 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if 00:38:22.861 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@162 -- # true 00:38:22.861 12:19:36 
chaining -- nvmf/common.sh@163 -- # ip netns exec nvmf_tgt_ns_spdk ip link delete nvmf_tgt_if2 00:38:22.861 Cannot open network namespace "nvmf_tgt_ns_spdk": No such file or directory 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@163 -- # true 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@166 -- # ip netns add nvmf_tgt_ns_spdk 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@169 -- # ip link add nvmf_init_if type veth peer name nvmf_init_br 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@170 -- # ip link add nvmf_tgt_if type veth peer name nvmf_tgt_br 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@171 -- # ip link add nvmf_tgt_if2 type veth peer name nvmf_tgt_br2 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@174 -- # ip link set nvmf_tgt_if netns nvmf_tgt_ns_spdk 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@175 -- # ip link set nvmf_tgt_if2 netns nvmf_tgt_ns_spdk 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@178 -- # ip addr add 10.0.0.1/24 dev nvmf_init_if 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@179 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.2/24 dev nvmf_tgt_if 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@180 -- # ip netns exec nvmf_tgt_ns_spdk ip addr add 10.0.0.3/24 dev nvmf_tgt_if2 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@183 -- # ip link set nvmf_init_if up 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@184 -- # ip link set nvmf_init_br up 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@185 -- # ip link set nvmf_tgt_br up 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@186 -- # ip link set nvmf_tgt_br2 up 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@187 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if up 00:38:22.861 12:19:36 chaining -- nvmf/common.sh@188 -- # ip netns exec nvmf_tgt_ns_spdk ip link set nvmf_tgt_if2 up 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@189 -- # ip netns exec nvmf_tgt_ns_spdk ip link set lo up 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@192 -- # ip link 
add nvmf_br type bridge 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@193 -- # ip link set nvmf_br up 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@196 -- # ip link set nvmf_init_br master nvmf_br 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@197 -- # ip link set nvmf_tgt_br master nvmf_br 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@198 -- # ip link set nvmf_tgt_br2 master nvmf_br 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@201 -- # iptables -I INPUT 1 -i nvmf_init_if -p tcp --dport 4420 -j ACCEPT 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@202 -- # iptables -A FORWARD -i nvmf_br -o nvmf_br -j ACCEPT 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@205 -- # ping -c 1 10.0.0.2 00:38:23.121 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:38:23.121 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.110 ms 00:38:23.121 00:38:23.121 --- 10.0.0.2 ping statistics --- 00:38:23.121 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:23.121 rtt min/avg/max/mdev = 0.110/0.110/0.110/0.000 ms 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@206 -- # ping -c 1 10.0.0.3 00:38:23.121 PING 10.0.0.3 (10.0.0.3) 56(84) bytes of data. 00:38:23.121 64 bytes from 10.0.0.3: icmp_seq=1 ttl=64 time=0.085 ms 00:38:23.121 00:38:23.121 --- 10.0.0.3 ping statistics --- 00:38:23.121 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:23.121 rtt min/avg/max/mdev = 0.085/0.085/0.085/0.000 ms 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@207 -- # ip netns exec nvmf_tgt_ns_spdk ping -c 1 10.0.0.1 00:38:23.121 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:38:23.121 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.038 ms 00:38:23.121 00:38:23.121 --- 10.0.0.1 ping statistics --- 00:38:23.121 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:38:23.121 rtt min/avg/max/mdev = 0.038/0.038/0.038/0.000 ms 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@209 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@433 -- # return 0 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:38:23.121 12:19:36 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@722 -- # xtrace_disable 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@481 -- # nvmfpid=1682590 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@480 -- # ip netns exec nvmf_tgt_ns_spdk ip netns exec nvmf_tgt_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:38:23.121 12:19:36 chaining -- nvmf/common.sh@482 -- # waitforlisten 1682590 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@829 -- # '[' -z 1682590 ']' 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/spdk.sock 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:23.121 12:19:36 
chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:38:23.121 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:23.121 12:19:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:23.380 [2024-07-15 12:19:36.821085] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:23.380 [2024-07-15 12:19:36.821221] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:38:23.639 [2024-07-15 12:19:37.038971] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:23.639 [2024-07-15 12:19:37.173261] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:38:23.639 [2024-07-15 12:19:37.173322] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:38:23.639 [2024-07-15 12:19:37.173340] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:38:23.639 [2024-07-15 12:19:37.173357] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:38:23.639 [2024-07-15 12:19:37.173371] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
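The `nvmf_veth_init` fallback traced above builds a small virtual topology: a network namespace holding the target side of two veth pairs, with the host-side peers enslaved to a bridge, so the TCP transport test can run without real NICs. A dry-run sketch of that topology follows; interface and namespace names are taken from the log, but the commands are only printed, not executed, since the real sequence needs root.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the veth/bridge topology nvmf_veth_init builds.
NS=nvmf_tgt_ns_spdk

run() { echo "ip $*"; }   # swap 'echo' for the real binary to actually apply

run netns add "$NS"
# One veth pair per endpoint; the *_br peers stay on the host side.
run link add nvmf_init_if type veth peer name nvmf_init_br
run link add nvmf_tgt_if type veth peer name nvmf_tgt_br
# Move the target-side interface into the namespace, then address both ends.
run link set nvmf_tgt_if netns "$NS"
run addr add 10.0.0.1/24 dev nvmf_init_if
run netns exec "$NS" ip addr add 10.0.0.2/24 dev nvmf_tgt_if
# Bridge the host-side peers together so the two ends can reach each other.
run link add nvmf_br type bridge
run link set nvmf_init_br master nvmf_br
run link set nvmf_tgt_br master nvmf_br
```

The pings at 10.0.0.1/10.0.0.2/10.0.0.3 in the log are the smoke test that this wiring (plus the second target interface, omitted here) came up correctly.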
00:38:23.639 [2024-07-15 12:19:37.173407] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:24.207 12:19:37 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@728 -- # xtrace_disable 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:24.207 12:19:37 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:38:24.207 12:19:37 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@559 -- # xtrace_disable 00:38:24.207 12:19:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:24.207 malloc0 00:38:24.207 [2024-07-15 12:19:37.783927] tcp.c: 672:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:38:24.207 [2024-07-15 12:19:37.800213] tcp.c: 967:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@587 -- # [[ 0 == 0 ]] 00:38:24.466 12:19:37 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:38:24.466 12:19:37 chaining -- bdev/chaining.sh@189 -- # bperfpid=1682783 00:38:24.466 12:19:37 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:38:24.466 12:19:37 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1682783 /var/tmp/bperf.sock 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@829 -- # '[' -z 1682783 ']' 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@834 -- # 
local max_retries=100 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:38:24.466 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:24.466 12:19:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:24.466 [2024-07-15 12:19:37.876343] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 00:38:24.466 [2024-07-15 12:19:37.876411] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1682783 ] 00:38:24.466 [2024-07-15 12:19:38.006382] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:24.725 [2024-07-15 12:19:38.107799] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:25.662 12:19:39 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:25.662 12:19:39 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:25.662 12:19:39 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:38:25.662 12:19:39 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:38:25.921 [2024-07-15 12:19:39.485428] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:38:25.921 nvme0n1 00:38:25.921 true 00:38:25.921 crypto0 00:38:25.921 12:19:39 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:38:26.180 Running I/O for 5 seconds... 
00:38:31.454 00:38:31.454 Latency(us) 00:38:31.454 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:31.454 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:38:31.454 Verification LBA range: start 0x0 length 0x2000 00:38:31.454 crypto0 : 5.03 6923.43 27.04 0.00 0.00 36845.80 4559.03 27126.21 00:38:31.454 =================================================================================================================== 00:38:31.454 Total : 6923.43 27.04 0.00 0.00 36845.80 4559.03 27126.21 00:38:31.454 0 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@205 -- # sequence=69622 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:31.454 12:19:44 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:31.454 12:19:44 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@206 -- # encrypt=34811 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:31.713 12:19:45 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@207 -- # decrypt=34811 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:38:31.972 12:19:45 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:31.972 12:19:45 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:38:32.231 12:19:45 chaining -- bdev/chaining.sh@208 -- # crc32c=69622 00:38:32.231 12:19:45 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:38:32.231 12:19:45 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:38:32.231 12:19:45 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:38:32.231 12:19:45 chaining -- bdev/chaining.sh@214 -- # killprocess 1682783 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@948 -- # '[' -z 1682783 ']' 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@952 -- # kill -0 1682783 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@953 -- # uname 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1682783 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1682783' 00:38:32.231 killing process with pid 1682783 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@967 -- # kill 1682783 00:38:32.231 Received shutdown signal, test time was about 5.000000 seconds 00:38:32.231 00:38:32.231 Latency(us) 00:38:32.231 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:32.231 
=================================================================================================================== 00:38:32.231 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:32.231 12:19:45 chaining -- common/autotest_common.sh@972 -- # wait 1682783 00:38:32.489 12:19:46 chaining -- bdev/chaining.sh@219 -- # bperfpid=1683740 00:38:32.489 12:19:46 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:38:32.489 12:19:46 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1683740 /var/tmp/bperf.sock 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@829 -- # '[' -z 1683740 ']' 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@833 -- # local rpc_addr=/var/tmp/bperf.sock 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@834 -- # local max_retries=100 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@836 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:38:32.489 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@838 -- # xtrace_disable 00:38:32.489 12:19:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:32.489 [2024-07-15 12:19:46.078531] Starting SPDK v24.09-pre git sha1 2728651ee / DPDK 24.03.0 initialization... 
00:38:32.489 [2024-07-15 12:19:46.078611] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1683740 ] 00:38:32.747 [2024-07-15 12:19:46.210936] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:38:32.747 [2024-07-15 12:19:46.312109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:38:33.315 12:19:46 chaining -- common/autotest_common.sh@858 -- # (( i == 0 )) 00:38:33.316 12:19:46 chaining -- common/autotest_common.sh@862 -- # return 0 00:38:33.316 12:19:46 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:38:33.316 12:19:46 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:38:33.889 [2024-07-15 12:19:47.297945] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:38:33.889 nvme0n1 00:38:33.889 true 00:38:33.889 crypto0 00:38:33.889 12:19:47 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:38:33.889 Running I/O for 5 seconds... 
00:38:39.186 00:38:39.186 Latency(us) 00:38:39.186 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:39.186 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:38:39.186 Verification LBA range: start 0x0 length 0x200 00:38:39.186 crypto0 : 5.01 1642.65 102.67 0.00 0.00 19102.49 869.06 20401.64 00:38:39.186 =================================================================================================================== 00:38:39.186 Total : 1642.65 102.67 0.00 0.00 19102.49 869.06 20401.64 00:38:39.186 0 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@233 -- # sequence=16450 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:38:39.186 12:19:52 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:39.186 12:19:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:38:39.445 12:19:52 chaining -- bdev/chaining.sh@234 -- # encrypt=8225 00:38:39.445 12:19:52 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:38:39.445 12:19:52 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:39.446 12:19:52 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@235 -- # decrypt=8225 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:38:39.705 12:19:53 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:38:39.705 12:19:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:38:39.963 12:19:53 chaining -- bdev/chaining.sh@236 -- # crc32c=16450 00:38:39.963 12:19:53 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:38:39.963 12:19:53 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:38:39.963 12:19:53 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:38:39.963 12:19:53 chaining -- bdev/chaining.sh@242 -- # killprocess 1683740 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@948 -- # '[' -z 1683740 ']' 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@952 -- # kill -0 1683740 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@953 -- # uname 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1683740 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_0 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@958 -- # '[' reactor_0 = sudo ']' 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1683740' 00:38:39.963 killing process with pid 1683740 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@967 -- # kill 1683740 00:38:39.963 Received shutdown signal, test time was about 5.000000 seconds 00:38:39.963 00:38:39.963 Latency(us) 00:38:39.963 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:38:39.963 
=================================================================================================================== 00:38:39.963 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:38:39.963 12:19:53 chaining -- common/autotest_common.sh@972 -- # wait 1683740 00:38:40.222 12:19:53 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@117 -- # sync 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@120 -- # set +e 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:38:40.222 rmmod nvme_tcp 00:38:40.222 rmmod nvme_fabrics 00:38:40.222 rmmod nvme_keyring 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@124 -- # set -e 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@125 -- # return 0 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@489 -- # '[' -n 1682590 ']' 00:38:40.222 12:19:53 chaining -- nvmf/common.sh@490 -- # killprocess 1682590 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@948 -- # '[' -z 1682590 ']' 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@952 -- # kill -0 1682590 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@953 -- # uname 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@953 -- # '[' Linux = Linux ']' 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@954 -- # ps --no-headers -o comm= 1682590 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@954 -- # process_name=reactor_1 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@958 -- # '[' reactor_1 = sudo ']' 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@966 -- # echo 'killing process with pid 1682590' 00:38:40.222 killing process with pid 
1682590 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@967 -- # kill 1682590 00:38:40.222 12:19:53 chaining -- common/autotest_common.sh@972 -- # wait 1682590 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@274 -- # [[ nvmf_tgt_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:38:40.789 12:19:54 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:38:40.789 12:19:54 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:38:40.789 12:19:54 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush nvmf_init_if 00:38:40.789 12:19:54 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:38:40.789 00:38:40.789 real 0m46.676s 00:38:40.789 user 1m0.364s 00:38:40.789 sys 0m13.659s 00:38:40.789 12:19:54 chaining -- common/autotest_common.sh@1124 -- # xtrace_disable 00:38:40.790 12:19:54 chaining -- common/autotest_common.sh@10 -- # set +x 00:38:40.790 ************************************ 00:38:40.790 END TEST chaining 00:38:40.790 ************************************ 00:38:40.790 12:19:54 -- common/autotest_common.sh@1142 -- # return 0 00:38:40.790 12:19:54 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:38:40.790 12:19:54 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:38:40.790 12:19:54 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:38:40.790 12:19:54 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:38:40.790 12:19:54 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:38:40.790 12:19:54 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:38:40.790 12:19:54 -- common/autotest_common.sh@722 -- # xtrace_disable 00:38:40.790 
12:19:54 -- common/autotest_common.sh@10 -- # set +x 00:38:40.790 12:19:54 -- spdk/autotest.sh@383 -- # autotest_cleanup 00:38:40.790 12:19:54 -- common/autotest_common.sh@1392 -- # local autotest_es=0 00:38:40.790 12:19:54 -- common/autotest_common.sh@1393 -- # xtrace_disable 00:38:40.790 12:19:54 -- common/autotest_common.sh@10 -- # set +x 00:38:46.062 INFO: APP EXITING 00:38:46.062 INFO: killing all VMs 00:38:46.062 INFO: killing vhost app 00:38:46.062 INFO: EXIT DONE 00:38:49.454 Waiting for block devices as requested 00:38:49.454 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:38:49.454 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:38:49.454 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:38:49.454 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:38:49.454 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:38:49.454 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:38:49.713 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:38:49.713 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:38:49.713 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:38:49.970 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:38:49.970 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:38:49.970 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:38:50.228 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:38:50.228 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:38:50.228 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:38:50.487 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:38:50.487 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:38:54.678 Cleaning 00:38:54.678 Removing: /var/run/dpdk/spdk0/config 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:38:54.678 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:38:54.678 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:38:54.678 Removing: /var/run/dpdk/spdk0/hugepage_info 00:38:54.678 Removing: /dev/shm/nvmf_trace.0 00:38:54.678 Removing: /dev/shm/spdk_tgt_trace.pid1409051 00:38:54.678 Removing: /var/run/dpdk/spdk0 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1404114 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1406810 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1409051 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1409591 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1410362 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1410666 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1411421 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1411601 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1411883 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1419043 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1421270 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1421500 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1421751 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1422154 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1422395 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1422647 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1422888 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1423175 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1423924 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1426521 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1426756 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1427051 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1427272 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1427303 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1427525 00:38:54.678 Removing: /var/run/dpdk/spdk_pid1427727 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1427920 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1428162 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1428477 00:38:54.938 
Removing: /var/run/dpdk/spdk_pid1428675 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1428874 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1429069 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1429268 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1429523 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1429819 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1430025 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1430217 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1430419 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1430610 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1430873 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1431166 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1431371 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1431572 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1431768 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1431964 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1432326 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1432604 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1432910 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1433377 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1433903 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1434349 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1434707 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1435074 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1435151 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1435558 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1435997 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1436246 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1436430 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1440735 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1442378 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1443977 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1444865 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1445961 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1446306 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1446412 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1446517 00:38:54.938 Removing: 
/var/run/dpdk/spdk_pid1450307 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1450857 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1451760 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1452105 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1457360 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1459086 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1460528 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1464786 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1466428 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1467402 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1471821 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1474259 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1475226 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1484990 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1487714 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1488699 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1498788 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1500910 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1501932 00:38:54.938 Removing: /var/run/dpdk/spdk_pid1511693 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1515525 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1516582 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1527299 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1529787 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1530946 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1542717 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1545144 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1546295 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1557368 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1561355 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1562376 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1563860 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1567916 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1573105 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1575446 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1580112 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1583502 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1588978 
00:38:55.199 Removing: /var/run/dpdk/spdk_pid1591753 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1598756 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1601339 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1607406 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1609881 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1616185 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1618989 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1623946 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1624305 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1624658 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1625014 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1625478 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1626221 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1626906 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1627348 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1630036 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1632798 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1635437 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1637733 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1640269 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1643304 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1646029 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1648348 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1648915 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1649289 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1651454 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1653301 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1655113 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1656139 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1657278 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1657818 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1657989 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1658079 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1658367 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1658467 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1659708 00:38:55.199 Removing: /var/run/dpdk/spdk_pid1661217 00:38:55.199 Removing: 
/var/run/dpdk/spdk_pid1662718
00:38:55.199 Removing: /var/run/dpdk/spdk_pid1663458
00:38:55.199 Removing: /var/run/dpdk/spdk_pid1664314
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1664511
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1664603
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1664723
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1665631
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1666203
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1666632
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1669268
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1671121
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1672956
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1674014
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1675085
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1675630
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1675814
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1679704
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1679915
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1680083
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1680144
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1680354
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1680570
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1681446
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1682783
00:38:55.459 Removing: /var/run/dpdk/spdk_pid1683740
00:38:55.459 Clean
00:38:55.459 12:20:09 -- common/autotest_common.sh@1451 -- # return 0
00:38:55.459 12:20:09 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup
00:38:55.459 12:20:09 -- common/autotest_common.sh@728 -- # xtrace_disable
00:38:55.459 12:20:09 -- common/autotest_common.sh@10 -- # set +x
00:38:55.459 12:20:09 -- spdk/autotest.sh@386 -- # timing_exit autotest
00:38:55.459 12:20:09 -- common/autotest_common.sh@728 -- # xtrace_disable
00:38:55.459 12:20:09 -- common/autotest_common.sh@10 -- # set +x
00:38:55.718 12:20:09 -- spdk/autotest.sh@387 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:38:55.718 12:20:09 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:38:55.718 12:20:09 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:38:55.718 12:20:09 -- spdk/autotest.sh@391 -- # hash lcov
00:38:55.718 12:20:09 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:38:55.718 12:20:09 -- spdk/autotest.sh@393 -- # hostname
00:38:55.718 12:20:09 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-40 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:38:55.718 geninfo: WARNING: invalid characters removed from testname!
00:39:17.655 12:20:28 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:19.030 12:20:32 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:21.559 12:20:35 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:24.089 12:20:37 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:27.372 12:20:40 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:28.751 12:20:41 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:39:30.127 12:20:43 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:39:30.389 12:20:43 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:39:30.389 12:20:43 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:39:30.389 12:20:43 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:39:30.389 12:20:43 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:39:30.389 12:20:43 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:30.389 12:20:43 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:30.389 12:20:43 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:30.389 12:20:43 -- paths/export.sh@5 -- $ export PATH
00:39:30.389 12:20:43 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:39:30.389 12:20:43 -- common/autobuild_common.sh@443 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:39:30.389 12:20:43 -- common/autobuild_common.sh@444 -- $ date +%s
00:39:30.389 12:20:43 -- common/autobuild_common.sh@444 -- $ mktemp -dt spdk_1721038843.XXXXXX
00:39:30.389 12:20:43 -- common/autobuild_common.sh@444 -- $ SPDK_WORKSPACE=/tmp/spdk_1721038843.MjbqsA
00:39:30.389 12:20:43 -- common/autobuild_common.sh@446 -- $ [[ -n '' ]]
00:39:30.389 12:20:43 -- common/autobuild_common.sh@450 -- $ '[' -n '' ']'
00:39:30.389 12:20:43 -- common/autobuild_common.sh@453 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:39:30.389 12:20:43 -- common/autobuild_common.sh@457 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:39:30.389 12:20:43 -- common/autobuild_common.sh@459 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:39:30.389 12:20:43 -- common/autobuild_common.sh@460 -- $ get_config_params
00:39:30.389 12:20:43 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:39:30.389 12:20:43 -- common/autotest_common.sh@10 -- $ set +x
00:39:30.389 12:20:43 -- common/autobuild_common.sh@460 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:39:30.389 12:20:43 -- common/autobuild_common.sh@462 -- $ start_monitor_resources
00:39:30.389 12:20:43 -- pm/common@17 -- $ local monitor
00:39:30.389 12:20:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:30.389 12:20:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:30.389 12:20:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:30.389 12:20:43 -- pm/common@21 -- $ date +%s
00:39:30.389 12:20:43 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:30.389 12:20:43 -- pm/common@21 -- $ date +%s
00:39:30.389 12:20:43 -- pm/common@25 -- $ sleep 1
00:39:30.389 12:20:43 -- pm/common@21 -- $ date +%s
00:39:30.389 12:20:43 -- pm/common@21 -- $ date +%s
00:39:30.389 12:20:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721038843
00:39:30.389 12:20:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721038843
00:39:30.389 12:20:43 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721038843
00:39:30.389 12:20:43 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721038843
00:39:30.389 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721038843_collect-vmstat.pm.log
00:39:30.389 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721038843_collect-cpu-load.pm.log
00:39:30.389 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721038843_collect-cpu-temp.pm.log
00:39:30.389 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721038843_collect-bmc-pm.bmc.pm.log
00:39:31.327 12:20:44 -- common/autobuild_common.sh@463 -- $ trap stop_monitor_resources EXIT
00:39:31.327 12:20:44 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j72
00:39:31.327 12:20:44 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:31.327 12:20:44 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:39:31.327 12:20:44 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:39:31.327 12:20:44 -- spdk/autopackage.sh@19 -- $ timing_finish
00:39:31.327 12:20:44 -- common/autotest_common.sh@734 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:39:31.328 12:20:44 -- common/autotest_common.sh@735 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:39:31.328 12:20:44 -- common/autotest_common.sh@737 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:39:31.328 12:20:44 -- spdk/autopackage.sh@20 -- $ exit 0
00:39:31.328 12:20:44 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:39:31.328 12:20:44 -- pm/common@29 -- $ signal_monitor_resources TERM
00:39:31.328 12:20:44 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:39:31.328 12:20:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:31.328 12:20:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:39:31.328 12:20:44 -- pm/common@44 -- $ pid=1694800
00:39:31.328 12:20:44 -- pm/common@50 -- $ kill -TERM 1694800
00:39:31.328 12:20:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:31.328 12:20:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:39:31.328 12:20:44 -- pm/common@44 -- $ pid=1694802
00:39:31.328 12:20:44 -- pm/common@50 -- $ kill -TERM 1694802
00:39:31.328 12:20:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:31.328 12:20:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:39:31.328 12:20:44 -- pm/common@44 -- $ pid=1694804
00:39:31.328 12:20:44 -- pm/common@50 -- $ kill -TERM 1694804
00:39:31.328 12:20:44 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:39:31.328 12:20:44 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:39:31.328 12:20:44 -- pm/common@44 -- $ pid=1694828
00:39:31.328 12:20:44 -- pm/common@50 -- $ sudo -E kill -TERM 1694828
00:39:31.328 + [[ -n 1291144 ]]
00:39:31.328 + sudo kill 1291144
00:39:31.597 [Pipeline] }
00:39:31.615 [Pipeline] // stage
00:39:31.620 [Pipeline] }
00:39:31.635 [Pipeline] // timeout
00:39:31.641 [Pipeline] }
00:39:31.656 [Pipeline] // catchError
00:39:31.661 [Pipeline] }
00:39:31.681 [Pipeline] // wrap
00:39:31.688 [Pipeline] }
00:39:31.703 [Pipeline] // catchError
00:39:31.711 [Pipeline] stage
00:39:31.713 [Pipeline] { (Epilogue)
00:39:31.726 [Pipeline] catchError
00:39:31.728 [Pipeline] {
00:39:31.745 [Pipeline] echo
00:39:31.748 Cleanup processes
00:39:31.758 [Pipeline] sh
00:39:32.046 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:32.046 1694904 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:39:32.046 1695122 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:32.064 [Pipeline] sh
00:39:32.445 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:39:32.445 ++ grep -v 'sudo pgrep'
00:39:32.445 ++ awk '{print $1}'
00:39:32.445 + sudo kill -9 1694904
00:39:32.456 [Pipeline] sh
00:39:32.738 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:39:44.954 [Pipeline] sh
00:39:45.235 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:39:45.235 Artifacts sizes are good
00:39:45.252 [Pipeline] archiveArtifacts
00:39:45.260 Archiving artifacts
00:39:45.388 [Pipeline] sh
00:39:45.670 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:39:45.685 [Pipeline] cleanWs
00:39:45.694 [WS-CLEANUP] Deleting project workspace...
00:39:45.694 [WS-CLEANUP] Deferred wipeout is used...
00:39:45.701 [WS-CLEANUP] done
00:39:45.703 [Pipeline] }
00:39:45.726 [Pipeline] // catchError
00:39:45.738 [Pipeline] sh
00:39:46.020 + logger -p user.info -t JENKINS-CI
00:39:46.030 [Pipeline] }
00:39:46.046 [Pipeline] // stage
00:39:46.052 [Pipeline] }
00:39:46.072 [Pipeline] // node
00:39:46.078 [Pipeline] End of Pipeline
00:39:46.120 Finished: SUCCESS